Dec 02 22:51:51 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 02 22:51:51 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 02 22:51:51 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 02 22:51:51 localhost kernel: BIOS-provided physical RAM map:
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 02 22:51:51 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 02 22:51:51 localhost kernel: NX (Execute Disable) protection: active
Dec 02 22:51:51 localhost kernel: APIC: Static calls initialized
Dec 02 22:51:51 localhost kernel: SMBIOS 2.8 present.
Dec 02 22:51:51 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 02 22:51:51 localhost kernel: Hypervisor detected: KVM
Dec 02 22:51:51 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 02 22:51:51 localhost kernel: kvm-clock: using sched offset of 3454882462 cycles
Dec 02 22:51:51 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 02 22:51:51 localhost kernel: tsc: Detected 2800.000 MHz processor
Dec 02 22:51:51 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 02 22:51:51 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 02 22:51:51 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 02 22:51:51 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 02 22:51:51 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 02 22:51:51 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 02 22:51:51 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 02 22:51:51 localhost kernel: Using GB pages for direct mapping
Dec 02 22:51:51 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 02 22:51:51 localhost kernel: ACPI: Early table checksum verification disabled
Dec 02 22:51:51 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 02 22:51:51 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:51:51 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:51:51 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:51:51 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 02 22:51:51 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:51:51 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 02 22:51:51 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 02 22:51:51 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 02 22:51:51 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 02 22:51:51 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 02 22:51:51 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 02 22:51:51 localhost kernel: No NUMA configuration found
Dec 02 22:51:51 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 02 22:51:51 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 02 22:51:51 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 02 22:51:51 localhost kernel: Zone ranges:
Dec 02 22:51:51 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 02 22:51:51 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 02 22:51:51 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 02 22:51:51 localhost kernel:   Device   empty
Dec 02 22:51:51 localhost kernel: Movable zone start for each node
Dec 02 22:51:51 localhost kernel: Early memory node ranges
Dec 02 22:51:51 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 02 22:51:51 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 02 22:51:51 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 02 22:51:51 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 02 22:51:51 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 02 22:51:51 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 02 22:51:51 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 02 22:51:51 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 02 22:51:51 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 02 22:51:51 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 02 22:51:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 02 22:51:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 02 22:51:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 02 22:51:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 02 22:51:51 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 02 22:51:51 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 02 22:51:51 localhost kernel: TSC deadline timer available
Dec 02 22:51:51 localhost kernel: CPU topo: Max. logical packages:   8
Dec 02 22:51:51 localhost kernel: CPU topo: Max. logical dies:       8
Dec 02 22:51:51 localhost kernel: CPU topo: Max. dies per package:   1
Dec 02 22:51:51 localhost kernel: CPU topo: Max. threads per core:   1
Dec 02 22:51:51 localhost kernel: CPU topo: Num. cores per package:     1
Dec 02 22:51:51 localhost kernel: CPU topo: Num. threads per package:   1
Dec 02 22:51:51 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 02 22:51:51 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 02 22:51:51 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 02 22:51:51 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 02 22:51:51 localhost kernel: Booting paravirtualized kernel on KVM
Dec 02 22:51:51 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 02 22:51:51 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 02 22:51:51 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 02 22:51:51 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 02 22:51:51 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 02 22:51:51 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 02 22:51:51 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 02 22:51:51 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 02 22:51:51 localhost kernel: random: crng init done
Dec 02 22:51:51 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 02 22:51:51 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 02 22:51:51 localhost kernel: Fallback order for Node 0: 0 
Dec 02 22:51:51 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 02 22:51:51 localhost kernel: Policy zone: Normal
Dec 02 22:51:51 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 02 22:51:51 localhost kernel: software IO TLB: area num 8.
Dec 02 22:51:51 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 02 22:51:51 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 02 22:51:51 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 02 22:51:51 localhost kernel: Dynamic Preempt: voluntary
Dec 02 22:51:51 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 02 22:51:51 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 02 22:51:51 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 02 22:51:51 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 02 22:51:51 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 02 22:51:51 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 02 22:51:51 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 02 22:51:51 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 02 22:51:51 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 02 22:51:51 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 02 22:51:51 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 02 22:51:51 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 02 22:51:51 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 02 22:51:51 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 02 22:51:51 localhost kernel: Console: colour VGA+ 80x25
Dec 02 22:51:51 localhost kernel: printk: console [ttyS0] enabled
Dec 02 22:51:51 localhost kernel: ACPI: Core revision 20230331
Dec 02 22:51:51 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 02 22:51:51 localhost kernel: x2apic enabled
Dec 02 22:51:51 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 02 22:51:51 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 02 22:51:51 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec 02 22:51:51 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 02 22:51:51 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 02 22:51:51 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 02 22:51:51 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 02 22:51:51 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 02 22:51:51 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 02 22:51:51 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 02 22:51:51 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 02 22:51:51 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 02 22:51:51 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 02 22:51:51 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 02 22:51:51 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 02 22:51:51 localhost kernel: x86/bugs: return thunk changed
Dec 02 22:51:51 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 02 22:51:51 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 02 22:51:51 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 02 22:51:51 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 02 22:51:51 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 02 22:51:51 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 02 22:51:51 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 02 22:51:51 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 02 22:51:51 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 02 22:51:51 localhost kernel: landlock: Up and running.
Dec 02 22:51:51 localhost kernel: Yama: becoming mindful.
Dec 02 22:51:51 localhost kernel: SELinux:  Initializing.
Dec 02 22:51:51 localhost kernel: LSM support for eBPF active
Dec 02 22:51:51 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 02 22:51:51 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 02 22:51:51 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 02 22:51:51 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 02 22:51:51 localhost kernel: ... version:                0
Dec 02 22:51:51 localhost kernel: ... bit width:              48
Dec 02 22:51:51 localhost kernel: ... generic registers:      6
Dec 02 22:51:51 localhost kernel: ... value mask:             0000ffffffffffff
Dec 02 22:51:51 localhost kernel: ... max period:             00007fffffffffff
Dec 02 22:51:51 localhost kernel: ... fixed-purpose events:   0
Dec 02 22:51:51 localhost kernel: ... event mask:             000000000000003f
Dec 02 22:51:51 localhost kernel: signal: max sigframe size: 1776
Dec 02 22:51:51 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 02 22:51:51 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 02 22:51:51 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 02 22:51:51 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 02 22:51:51 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 02 22:51:51 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 02 22:51:51 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec 02 22:51:51 localhost kernel: node 0 deferred pages initialised in 8ms
Dec 02 22:51:51 localhost kernel: Memory: 7763900K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618208K reserved, 0K cma-reserved)
Dec 02 22:51:51 localhost kernel: devtmpfs: initialized
Dec 02 22:51:51 localhost kernel: x86/mm: Memory block size: 128MB
Dec 02 22:51:51 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 02 22:51:51 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 02 22:51:51 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 02 22:51:51 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 02 22:51:51 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 02 22:51:51 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 02 22:51:51 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 02 22:51:51 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 02 22:51:51 localhost kernel: audit: type=2000 audit(1764715909.270:1): state=initialized audit_enabled=0 res=1
Dec 02 22:51:51 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 02 22:51:51 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 02 22:51:51 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 02 22:51:51 localhost kernel: cpuidle: using governor menu
Dec 02 22:51:51 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 02 22:51:51 localhost kernel: PCI: Using configuration type 1 for base access
Dec 02 22:51:51 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 02 22:51:51 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 02 22:51:51 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 02 22:51:51 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 02 22:51:51 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 02 22:51:51 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 02 22:51:51 localhost kernel: Demotion targets for Node 0: null
Dec 02 22:51:51 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 02 22:51:51 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 02 22:51:51 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 02 22:51:51 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 02 22:51:51 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 02 22:51:51 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 02 22:51:51 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 02 22:51:51 localhost kernel: ACPI: Interpreter enabled
Dec 02 22:51:51 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 02 22:51:51 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 02 22:51:51 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 02 22:51:51 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 02 22:51:51 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 02 22:51:51 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 02 22:51:51 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [3] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [4] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [5] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [6] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [7] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [8] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [9] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [10] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [11] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [12] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [13] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [14] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [15] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [16] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [17] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [18] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [19] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [20] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [21] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [22] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [23] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [24] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [25] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [26] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [27] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [28] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [29] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [30] registered
Dec 02 22:51:51 localhost kernel: acpiphp: Slot [31] registered
Dec 02 22:51:51 localhost kernel: PCI host bridge to bus 0000:00
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 02 22:51:51 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 02 22:51:51 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 02 22:51:51 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 02 22:51:51 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 02 22:51:51 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 02 22:51:51 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 02 22:51:51 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 02 22:51:51 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 02 22:51:51 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 02 22:51:51 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 02 22:51:51 localhost kernel: iommu: Default domain type: Translated
Dec 02 22:51:51 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 02 22:51:51 localhost kernel: SCSI subsystem initialized
Dec 02 22:51:51 localhost kernel: ACPI: bus type USB registered
Dec 02 22:51:51 localhost kernel: usbcore: registered new interface driver usbfs
Dec 02 22:51:51 localhost kernel: usbcore: registered new interface driver hub
Dec 02 22:51:51 localhost kernel: usbcore: registered new device driver usb
Dec 02 22:51:51 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 02 22:51:51 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 02 22:51:51 localhost kernel: PTP clock support registered
Dec 02 22:51:51 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 02 22:51:51 localhost kernel: NetLabel: Initializing
Dec 02 22:51:51 localhost kernel: NetLabel:  domain hash size = 128
Dec 02 22:51:51 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 02 22:51:51 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 02 22:51:51 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 02 22:51:51 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 02 22:51:51 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 02 22:51:51 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 02 22:51:51 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 02 22:51:51 localhost kernel: vgaarb: loaded
Dec 02 22:51:51 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 02 22:51:51 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 02 22:51:51 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 02 22:51:51 localhost kernel: pnp: PnP ACPI init
Dec 02 22:51:51 localhost kernel: pnp 00:03: [dma 2]
Dec 02 22:51:51 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 02 22:51:51 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 02 22:51:51 localhost kernel: NET: Registered PF_INET protocol family
Dec 02 22:51:51 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 02 22:51:51 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 02 22:51:51 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 02 22:51:51 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 02 22:51:51 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 02 22:51:51 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 02 22:51:51 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 02 22:51:51 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 02 22:51:51 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 02 22:51:51 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 02 22:51:51 localhost kernel: NET: Registered PF_XDP protocol family
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 02 22:51:51 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 02 22:51:51 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 02 22:51:51 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 02 22:51:51 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 87507 usecs
Dec 02 22:51:51 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 02 22:51:51 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 02 22:51:51 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 02 22:51:51 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 02 22:51:51 localhost kernel: ACPI: bus type thunderbolt registered
Dec 02 22:51:51 localhost kernel: Initialise system trusted keyrings
Dec 02 22:51:51 localhost kernel: Key type blacklist registered
Dec 02 22:51:51 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 02 22:51:51 localhost kernel: zbud: loaded
Dec 02 22:51:51 localhost kernel: integrity: Platform Keyring initialized
Dec 02 22:51:51 localhost kernel: integrity: Machine keyring initialized
Dec 02 22:51:51 localhost kernel: Freeing initrd memory: 87804K
Dec 02 22:51:51 localhost kernel: NET: Registered PF_ALG protocol family
Dec 02 22:51:51 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 02 22:51:51 localhost kernel: Key type asymmetric registered
Dec 02 22:51:51 localhost kernel: Asymmetric key parser 'x509' registered
Dec 02 22:51:51 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 02 22:51:51 localhost kernel: io scheduler mq-deadline registered
Dec 02 22:51:51 localhost kernel: io scheduler kyber registered
Dec 02 22:51:51 localhost kernel: io scheduler bfq registered
Dec 02 22:51:51 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 02 22:51:51 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 02 22:51:51 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 02 22:51:51 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 02 22:51:51 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 02 22:51:51 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 02 22:51:51 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 02 22:51:51 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 02 22:51:51 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 02 22:51:51 localhost kernel: Non-volatile memory driver v1.3
Dec 02 22:51:51 localhost kernel: rdac: device handler registered
Dec 02 22:51:51 localhost kernel: hp_sw: device handler registered
Dec 02 22:51:51 localhost kernel: emc: device handler registered
Dec 02 22:51:51 localhost kernel: alua: device handler registered
Dec 02 22:51:51 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 02 22:51:51 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 02 22:51:51 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 02 22:51:51 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 02 22:51:51 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 02 22:51:51 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 02 22:51:51 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 02 22:51:51 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 02 22:51:51 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 02 22:51:51 localhost kernel: hub 1-0:1.0: USB hub found
Dec 02 22:51:51 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 02 22:51:51 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 02 22:51:51 localhost kernel: usbserial: USB Serial support registered for generic
Dec 02 22:51:51 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 02 22:51:51 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 02 22:51:51 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 02 22:51:51 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 02 22:51:51 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 02 22:51:51 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 02 22:51:51 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-02T22:51:50 UTC (1764715910)
Dec 02 22:51:51 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 02 22:51:51 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 02 22:51:51 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 02 22:51:51 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 02 22:51:51 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 02 22:51:51 localhost kernel: usbcore: registered new interface driver usbhid
Dec 02 22:51:51 localhost kernel: usbhid: USB HID core driver
Dec 02 22:51:51 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 02 22:51:51 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 02 22:51:51 localhost kernel: Initializing XFRM netlink socket
Dec 02 22:51:51 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 02 22:51:51 localhost kernel: Segment Routing with IPv6
Dec 02 22:51:51 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 02 22:51:51 localhost kernel: mpls_gso: MPLS GSO support
Dec 02 22:51:51 localhost kernel: IPI shorthand broadcast: enabled
Dec 02 22:51:51 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 02 22:51:51 localhost kernel: AES CTR mode by8 optimization enabled
Dec 02 22:51:51 localhost kernel: sched_clock: Marking stable (1255001780, 141257900)->(1526651629, -130391949)
Dec 02 22:51:51 localhost kernel: registered taskstats version 1
Dec 02 22:51:51 localhost kernel: Loading compiled-in X.509 certificates
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 02 22:51:51 localhost kernel: Demotion targets for Node 0: null
Dec 02 22:51:51 localhost kernel: page_owner is disabled
Dec 02 22:51:51 localhost kernel: Key type .fscrypt registered
Dec 02 22:51:51 localhost kernel: Key type fscrypt-provisioning registered
Dec 02 22:51:51 localhost kernel: Key type big_key registered
Dec 02 22:51:51 localhost kernel: Key type encrypted registered
Dec 02 22:51:51 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 02 22:51:51 localhost kernel: Loading compiled-in module X.509 certificates
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 02 22:51:51 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 02 22:51:51 localhost kernel: ima: No architecture policies found
Dec 02 22:51:51 localhost kernel: evm: Initialising EVM extended attributes:
Dec 02 22:51:51 localhost kernel: evm: security.selinux
Dec 02 22:51:51 localhost kernel: evm: security.SMACK64 (disabled)
Dec 02 22:51:51 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 02 22:51:51 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 02 22:51:51 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 02 22:51:51 localhost kernel: evm: security.apparmor (disabled)
Dec 02 22:51:51 localhost kernel: evm: security.ima
Dec 02 22:51:51 localhost kernel: evm: security.capability
Dec 02 22:51:51 localhost kernel: evm: HMAC attrs: 0x1
Dec 02 22:51:51 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 02 22:51:51 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 02 22:51:51 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 02 22:51:51 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 02 22:51:51 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 02 22:51:51 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 02 22:51:51 localhost kernel: Running certificate verification RSA selftest
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 02 22:51:51 localhost kernel: Running certificate verification ECDSA selftest
Dec 02 22:51:51 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 02 22:51:51 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 02 22:51:51 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 02 22:51:51 localhost kernel: clk: Disabling unused clocks
Dec 02 22:51:51 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 02 22:51:51 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 02 22:51:51 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 02 22:51:51 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 02 22:51:51 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 02 22:51:51 localhost kernel: Run /init as init process
Dec 02 22:51:51 localhost kernel:   with arguments:
Dec 02 22:51:51 localhost kernel:     /init
Dec 02 22:51:51 localhost kernel:   with environment:
Dec 02 22:51:51 localhost kernel:     HOME=/
Dec 02 22:51:51 localhost kernel:     TERM=linux
Dec 02 22:51:51 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 02 22:51:51 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 22:51:51 localhost systemd[1]: Detected virtualization kvm.
Dec 02 22:51:51 localhost systemd[1]: Detected architecture x86-64.
Dec 02 22:51:51 localhost systemd[1]: Running in initrd.
Dec 02 22:51:51 localhost systemd[1]: No hostname configured, using default hostname.
Dec 02 22:51:51 localhost systemd[1]: Hostname set to <localhost>.
Dec 02 22:51:51 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 02 22:51:51 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 02 22:51:51 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 02 22:51:51 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 02 22:51:51 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 02 22:51:51 localhost systemd[1]: Reached target Local File Systems.
Dec 02 22:51:51 localhost systemd[1]: Reached target Path Units.
Dec 02 22:51:51 localhost systemd[1]: Reached target Slice Units.
Dec 02 22:51:51 localhost systemd[1]: Reached target Swaps.
Dec 02 22:51:51 localhost systemd[1]: Reached target Timer Units.
Dec 02 22:51:51 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 02 22:51:51 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 02 22:51:51 localhost systemd[1]: Listening on Journal Socket.
Dec 02 22:51:51 localhost systemd[1]: Listening on udev Control Socket.
Dec 02 22:51:51 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 02 22:51:51 localhost systemd[1]: Reached target Socket Units.
Dec 02 22:51:51 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 02 22:51:51 localhost systemd[1]: Starting Journal Service...
Dec 02 22:51:51 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 02 22:51:51 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 02 22:51:51 localhost systemd[1]: Starting Create System Users...
Dec 02 22:51:51 localhost systemd[1]: Starting Setup Virtual Console...
Dec 02 22:51:51 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 02 22:51:51 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 02 22:51:51 localhost systemd[1]: Finished Create System Users.
Dec 02 22:51:51 localhost systemd-journald[307]: Journal started
Dec 02 22:51:51 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/b18b06c3ce104c45b42d5391c38cc9dc) is 8.0M, max 153.6M, 145.6M free.
Dec 02 22:51:51 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Dec 02 22:51:51 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Dec 02 22:51:51 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 02 22:51:51 localhost systemd[1]: Started Journal Service.
Dec 02 22:51:51 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 02 22:51:51 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 02 22:51:51 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 02 22:51:51 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 02 22:51:51 localhost systemd[1]: Finished Setup Virtual Console.
Dec 02 22:51:51 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 02 22:51:51 localhost systemd[1]: Starting dracut cmdline hook...
Dec 02 22:51:51 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Dec 02 22:51:51 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 02 22:51:51 localhost systemd[1]: Finished dracut cmdline hook.
Dec 02 22:51:51 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 02 22:51:51 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 02 22:51:51 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 02 22:51:51 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 02 22:51:51 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 02 22:51:51 localhost kernel: RPC: Registered udp transport module.
Dec 02 22:51:51 localhost kernel: RPC: Registered tcp transport module.
Dec 02 22:51:51 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 02 22:51:51 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 02 22:51:52 localhost rpc.statd[443]: Version 2.5.4 starting
Dec 02 22:51:52 localhost rpc.statd[443]: Initializing NSM state
Dec 02 22:51:52 localhost rpc.idmapd[448]: Setting log level to 0
Dec 02 22:51:52 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 02 22:51:52 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 22:51:52 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 22:51:52 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 22:51:52 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 02 22:51:52 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 02 22:51:52 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 02 22:51:52 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 02 22:51:52 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 22:51:52 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 02 22:51:52 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 22:51:52 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 22:51:52 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 02 22:51:52 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 02 22:51:52 localhost systemd[1]: Reached target Network.
Dec 02 22:51:52 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 02 22:51:52 localhost systemd[1]: Starting dracut initqueue hook...
Dec 02 22:51:52 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 02 22:51:52 localhost systemd[1]: Reached target System Initialization.
Dec 02 22:51:52 localhost systemd[1]: Reached target Basic System.
Dec 02 22:51:52 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 02 22:51:52 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 02 22:51:52 localhost kernel:  vda: vda1
Dec 02 22:51:52 localhost kernel: libata version 3.00 loaded.
Dec 02 22:51:52 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 02 22:51:52 localhost kernel: scsi host0: ata_piix
Dec 02 22:51:52 localhost kernel: scsi host1: ata_piix
Dec 02 22:51:52 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 02 22:51:52 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 02 22:51:52 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 02 22:51:52 localhost systemd[1]: Reached target Initrd Root Device.
Dec 02 22:51:52 localhost kernel: ata1: found unknown device (class 0)
Dec 02 22:51:52 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 02 22:51:52 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 02 22:51:52 localhost systemd-udevd[510]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 22:51:52 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 02 22:51:52 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 02 22:51:52 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 02 22:51:52 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 02 22:51:52 localhost systemd[1]: Finished dracut initqueue hook.
Dec 02 22:51:52 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 02 22:51:52 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 02 22:51:52 localhost systemd[1]: Reached target Remote File Systems.
Dec 02 22:51:52 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 02 22:51:52 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 02 22:51:52 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 02 22:51:52 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec 02 22:51:52 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 02 22:51:52 localhost systemd[1]: Mounting /sysroot...
Dec 02 22:51:53 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 02 22:51:53 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 02 22:51:53 localhost kernel: XFS (vda1): Ending clean mount
Dec 02 22:51:53 localhost systemd[1]: Mounted /sysroot.
Dec 02 22:51:53 localhost systemd[1]: Reached target Initrd Root File System.
Dec 02 22:51:53 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 02 22:51:53 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 02 22:51:53 localhost systemd[1]: Reached target Initrd File Systems.
Dec 02 22:51:53 localhost systemd[1]: Reached target Initrd Default Target.
Dec 02 22:51:53 localhost systemd[1]: Starting dracut mount hook...
Dec 02 22:51:53 localhost systemd[1]: Finished dracut mount hook.
Dec 02 22:51:53 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 02 22:51:53 localhost rpc.idmapd[448]: exiting on signal 15
Dec 02 22:51:53 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 02 22:51:53 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 02 22:51:53 localhost systemd[1]: Stopped target Network.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Timer Units.
Dec 02 22:51:53 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 02 22:51:53 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Basic System.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Path Units.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Remote File Systems.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Slice Units.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Socket Units.
Dec 02 22:51:53 localhost systemd[1]: Stopped target System Initialization.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Local File Systems.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Swaps.
Dec 02 22:51:53 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut mount hook.
Dec 02 22:51:53 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 02 22:51:53 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 02 22:51:53 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 02 22:51:53 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 02 22:51:53 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 02 22:51:53 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 02 22:51:53 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 02 22:51:53 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 02 22:51:53 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 02 22:51:53 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 02 22:51:53 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 02 22:51:53 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 02 22:51:53 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Closed udev Control Socket.
Dec 02 22:51:53 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Closed udev Kernel Socket.
Dec 02 22:51:53 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 02 22:51:53 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 02 22:51:53 localhost systemd[1]: Starting Cleanup udev Database...
Dec 02 22:51:53 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 02 22:51:53 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 02 22:51:53 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Stopped Create System Users.
Dec 02 22:51:53 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 02 22:51:53 localhost systemd[1]: Finished Cleanup udev Database.
Dec 02 22:51:53 localhost systemd[1]: Reached target Switch Root.
Dec 02 22:51:53 localhost systemd[1]: Starting Switch Root...
Dec 02 22:51:53 localhost systemd[1]: Switching root.
Dec 02 22:51:53 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Dec 02 22:51:53 localhost systemd-journald[307]: Journal stopped
Dec 02 22:51:54 localhost kernel: audit: type=1404 audit(1764715913.883:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability open_perms=1
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 22:51:54 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 22:51:54 localhost kernel: audit: type=1403 audit(1764715914.005:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 02 22:51:54 localhost systemd[1]: Successfully loaded SELinux policy in 125.137ms.
Dec 02 22:51:54 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.113ms.
Dec 02 22:51:54 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 02 22:51:54 localhost systemd[1]: Detected virtualization kvm.
Dec 02 22:51:54 localhost systemd[1]: Detected architecture x86-64.
Dec 02 22:51:54 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 22:51:54 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 02 22:51:54 localhost systemd[1]: Stopped Switch Root.
Dec 02 22:51:54 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 02 22:51:54 localhost systemd[1]: Created slice Slice /system/getty.
Dec 02 22:51:54 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 02 22:51:54 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 02 22:51:54 localhost systemd[1]: Created slice User and Session Slice.
Dec 02 22:51:54 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 02 22:51:54 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 02 22:51:54 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 02 22:51:54 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 02 22:51:54 localhost systemd[1]: Stopped target Switch Root.
Dec 02 22:51:54 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 02 22:51:54 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 02 22:51:54 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 02 22:51:54 localhost systemd[1]: Reached target Path Units.
Dec 02 22:51:54 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 02 22:51:54 localhost systemd[1]: Reached target Slice Units.
Dec 02 22:51:54 localhost systemd[1]: Reached target Swaps.
Dec 02 22:51:54 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 02 22:51:54 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 02 22:51:54 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 02 22:51:54 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 02 22:51:54 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 02 22:51:54 localhost systemd[1]: Listening on udev Control Socket.
Dec 02 22:51:54 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 02 22:51:54 localhost systemd[1]: Mounting Huge Pages File System...
Dec 02 22:51:54 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 02 22:51:54 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 02 22:51:54 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 02 22:51:54 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 22:51:54 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 02 22:51:54 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 22:51:54 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 02 22:51:54 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 02 22:51:54 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 02 22:51:54 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 02 22:51:54 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 02 22:51:54 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 02 22:51:54 localhost systemd[1]: Stopped Journal Service.
Dec 02 22:51:54 localhost systemd[1]: Starting Journal Service...
Dec 02 22:51:54 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 02 22:51:54 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 02 22:51:54 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 22:51:54 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 02 22:51:54 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 02 22:51:54 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 02 22:51:54 localhost systemd-journald[680]: Journal started
Dec 02 22:51:54 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 02 22:51:54 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 02 22:51:54 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 02 22:51:54 localhost kernel: fuse: init (API version 7.37)
Dec 02 22:51:54 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 02 22:51:54 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 02 22:51:54 localhost systemd[1]: Started Journal Service.
Dec 02 22:51:54 localhost systemd[1]: Mounted Huge Pages File System.
Dec 02 22:51:54 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 02 22:51:54 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 02 22:51:54 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 02 22:51:54 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 02 22:51:54 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 22:51:54 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 22:51:54 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 02 22:51:54 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 02 22:51:54 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 02 22:51:54 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 02 22:51:54 localhost kernel: ACPI: bus type drm_connector registered
Dec 02 22:51:54 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 02 22:51:54 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 02 22:51:54 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 02 22:51:54 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 02 22:51:54 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 02 22:51:54 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 02 22:51:54 localhost systemd[1]: Mounting FUSE Control File System...
Dec 02 22:51:54 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 02 22:51:54 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 02 22:51:54 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 02 22:51:54 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 02 22:51:54 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 02 22:51:54 localhost systemd[1]: Starting Create System Users...
Dec 02 22:51:54 localhost systemd[1]: Mounted FUSE Control File System.
Dec 02 22:51:54 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 02 22:51:54 localhost systemd-journald[680]: Received client request to flush runtime journal.
Dec 02 22:51:54 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 02 22:51:54 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 02 22:51:54 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 02 22:51:54 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 02 22:51:54 localhost systemd[1]: Finished Create System Users.
Dec 02 22:51:54 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 02 22:51:54 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 02 22:51:54 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 02 22:51:54 localhost systemd[1]: Reached target Local File Systems.
Dec 02 22:51:54 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 02 22:51:54 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 02 22:51:54 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 02 22:51:54 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 02 22:51:54 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 02 22:51:54 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 02 22:51:54 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 02 22:51:54 localhost bootctl[698]: Couldn't find EFI system partition, skipping.
Dec 02 22:51:54 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 02 22:51:54 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 02 22:51:54 localhost systemd[1]: Starting Security Auditing Service...
Dec 02 22:51:54 localhost systemd[1]: Starting RPC Bind...
Dec 02 22:51:54 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 02 22:51:54 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 02 22:51:54 localhost auditd[704]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 02 22:51:54 localhost auditd[704]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 02 22:51:54 localhost systemd[1]: Started RPC Bind.
Dec 02 22:51:54 localhost augenrules[709]: /sbin/augenrules: No change
Dec 02 22:51:54 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 02 22:51:54 localhost augenrules[724]: No rules
Dec 02 22:51:54 localhost augenrules[724]: enabled 1
Dec 02 22:51:54 localhost augenrules[724]: failure 1
Dec 02 22:51:54 localhost augenrules[724]: pid 704
Dec 02 22:51:54 localhost augenrules[724]: rate_limit 0
Dec 02 22:51:54 localhost augenrules[724]: backlog_limit 8192
Dec 02 22:51:54 localhost augenrules[724]: lost 0
Dec 02 22:51:54 localhost augenrules[724]: backlog 0
Dec 02 22:51:54 localhost augenrules[724]: backlog_wait_time 60000
Dec 02 22:51:54 localhost augenrules[724]: backlog_wait_time_actual 0
Dec 02 22:51:54 localhost augenrules[724]: enabled 1
Dec 02 22:51:54 localhost augenrules[724]: failure 1
Dec 02 22:51:54 localhost augenrules[724]: pid 704
Dec 02 22:51:54 localhost augenrules[724]: rate_limit 0
Dec 02 22:51:54 localhost augenrules[724]: backlog_limit 8192
Dec 02 22:51:54 localhost augenrules[724]: lost 0
Dec 02 22:51:54 localhost augenrules[724]: backlog 0
Dec 02 22:51:54 localhost augenrules[724]: backlog_wait_time 60000
Dec 02 22:51:54 localhost augenrules[724]: backlog_wait_time_actual 0
Dec 02 22:51:54 localhost augenrules[724]: enabled 1
Dec 02 22:51:54 localhost augenrules[724]: failure 1
Dec 02 22:51:54 localhost augenrules[724]: pid 704
Dec 02 22:51:54 localhost augenrules[724]: rate_limit 0
Dec 02 22:51:54 localhost augenrules[724]: backlog_limit 8192
Dec 02 22:51:54 localhost augenrules[724]: lost 0
Dec 02 22:51:54 localhost augenrules[724]: backlog 0
Dec 02 22:51:54 localhost augenrules[724]: backlog_wait_time 60000
Dec 02 22:51:54 localhost augenrules[724]: backlog_wait_time_actual 0
Dec 02 22:51:54 localhost systemd[1]: Started Security Auditing Service.
Dec 02 22:51:54 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 02 22:51:54 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 02 22:51:55 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 02 22:51:55 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 02 22:51:55 localhost systemd[1]: Starting Update is Completed...
Dec 02 22:51:55 localhost systemd[1]: Finished Update is Completed.
Dec 02 22:51:55 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Dec 02 22:51:55 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 02 22:51:55 localhost systemd[1]: Reached target System Initialization.
Dec 02 22:51:55 localhost systemd[1]: Started dnf makecache --timer.
Dec 02 22:51:55 localhost systemd[1]: Started Daily rotation of log files.
Dec 02 22:51:55 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 02 22:51:55 localhost systemd[1]: Reached target Timer Units.
Dec 02 22:51:55 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 02 22:51:55 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 02 22:51:55 localhost systemd[1]: Reached target Socket Units.
Dec 02 22:51:55 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 02 22:51:55 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 22:51:55 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 02 22:51:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 02 22:51:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 02 22:51:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 02 22:51:55 localhost systemd-udevd[746]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 22:51:55 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 02 22:51:55 localhost systemd[1]: Reached target Basic System.
Dec 02 22:51:55 localhost dbus-broker-lau[764]: Ready
Dec 02 22:51:55 localhost systemd[1]: Starting NTP client/server...
Dec 02 22:51:55 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 02 22:51:55 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 02 22:51:55 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 02 22:51:55 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 02 22:51:55 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 02 22:51:55 localhost chronyd[785]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 22:51:55 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 02 22:51:55 localhost chronyd[785]: Loaded 0 symmetric keys
Dec 02 22:51:55 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 02 22:51:55 localhost chronyd[785]: Using right/UTC timezone to obtain leap second data
Dec 02 22:51:55 localhost systemd[1]: Started irqbalance daemon.
Dec 02 22:51:55 localhost chronyd[785]: Loaded seccomp filter (level 2)
Dec 02 22:51:55 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 02 22:51:55 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 22:51:55 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 22:51:55 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 22:51:55 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 02 22:51:55 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 02 22:51:55 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 02 22:51:55 localhost systemd[1]: Starting User Login Management...
Dec 02 22:51:55 localhost systemd[1]: Started NTP client/server.
Dec 02 22:51:55 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 02 22:51:55 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 02 22:51:55 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 02 22:51:55 localhost systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 02 22:51:55 localhost systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 02 22:51:55 localhost kernel: Console: switching to colour dummy device 80x25
Dec 02 22:51:55 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 02 22:51:55 localhost kernel: [drm] features: -context_init
Dec 02 22:51:55 localhost kernel: [drm] number of scanouts: 1
Dec 02 22:51:55 localhost kernel: [drm] number of cap sets: 0
Dec 02 22:51:55 localhost systemd-logind[795]: New seat seat0.
Dec 02 22:51:55 localhost systemd[1]: Started User Login Management.
Dec 02 22:51:55 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 02 22:51:55 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 02 22:51:55 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 02 22:51:55 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 02 22:51:55 localhost kernel: kvm_amd: TSC scaling supported
Dec 02 22:51:55 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 02 22:51:55 localhost kernel: kvm_amd: Nested Paging enabled
Dec 02 22:51:55 localhost kernel: kvm_amd: LBR virtualization supported
Dec 02 22:51:55 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 02 22:51:55 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 02 22:51:55 localhost iptables.init[789]: iptables: Applying firewall rules: [  OK  ]
Dec 02 22:51:55 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 02 22:51:55 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 02 Dec 2025 22:51:55 +0000. Up 6.54 seconds.
Dec 02 22:51:56 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 02 22:51:56 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 02 22:51:56 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpk5n5eh1l.mount: Deactivated successfully.
Dec 02 22:51:56 localhost systemd[1]: Starting Hostname Service...
Dec 02 22:51:56 localhost systemd[1]: Started Hostname Service.
Dec 02 22:51:56 np0005542927.novalocal systemd-hostnamed[856]: Hostname set to <np0005542927.novalocal> (static)
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Reached target Preparation for Network.
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Starting Network Manager...
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4648] NetworkManager (version 1.54.1-1.el9) is starting... (boot:13ba9be2-a183-422c-a29d-1c0aec36730d)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4654] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4766] manager[0x556663d96080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4823] hostname: hostname: using hostnamed
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4824] hostname: static hostname changed from (none) to "np0005542927.novalocal"
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4833] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4981] manager[0x556663d96080]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.4983] manager[0x556663d96080]: rfkill: WWAN hardware radio set enabled
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5043] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5044] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5045] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5046] manager: Networking is enabled by state file
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5049] settings: Loaded settings plugin: keyfile (internal)
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5065] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5097] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5118] dhcp: init: Using DHCP client 'internal'
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5122] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5146] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5157] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5166] device (lo): Activation: starting connection 'lo' (c5650cb8-9795-426f-8b6b-42dce70d6cce)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5179] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5184] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5224] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5229] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5232] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5234] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5237] device (eth0): carrier: link connected
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5241] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5248] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5254] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5259] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5260] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5264] manager: NetworkManager state is now CONNECTING
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5266] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5273] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5278] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Started Network Manager.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5337] dhcp4 (eth0): state changed new lease, address=38.102.83.77
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5345] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Reached target Network.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5367] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5614] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5618] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5621] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5634] device (lo): Activation: successful, device activated.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5660] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5679] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5693] device (eth0): Activation: successful, device activated.
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5710] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 22:51:56 np0005542927.novalocal NetworkManager[860]: <info>  [1764715916.5722] manager: startup complete
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Reached target NFS client services.
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Reached target Remote File Systems.
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 02 22:51:56 np0005542927.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 02 Dec 2025 22:51:56 +0000. Up 7.60 seconds.
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |  eth0  | True |         38.102.83.77         | 255.255.255.0 | global | fa:16:3e:ed:c0:1a |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:feed:c01a/64 |       .       |  link  | fa:16:3e:ed:c0:1a |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 02 22:51:56 np0005542927.novalocal cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 02 22:51:57 np0005542927.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 02 22:51:57 np0005542927.novalocal useradd[986]: new group: name=cloud-user, GID=1001
Dec 02 22:51:57 np0005542927.novalocal useradd[986]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 02 22:51:57 np0005542927.novalocal useradd[986]: add 'cloud-user' to group 'adm'
Dec 02 22:51:57 np0005542927.novalocal useradd[986]: add 'cloud-user' to group 'systemd-journal'
Dec 02 22:51:57 np0005542927.novalocal useradd[986]: add 'cloud-user' to shadow group 'adm'
Dec 02 22:51:57 np0005542927.novalocal useradd[986]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Generating public/private rsa key pair.
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: The key fingerprint is:
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: SHA256:wVdZIJSaXVB5m/BuTEn6JtCiWRL+X80UWlGx5e2496k root@np0005542927.novalocal
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: The key's randomart image is:
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: +---[RSA 3072]----+
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |         .+++=.o*|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |       .. .o= o++|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |       .o=.o *o=+|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |        =o= o.*o.|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |        S* o =.+.|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |        o . . B.o|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |           . =. .|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |            .  .o|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |             E...|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: +----[SHA256]-----+
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Generating public/private ecdsa key pair.
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: The key fingerprint is:
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: SHA256:qfewvzqEBkVHh2/hOi2TgqNMwEdGJSanSIdhW7evX/4 root@np0005542927.novalocal
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: The key's randomart image is:
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: +---[ECDSA 256]---+
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: | =+B.+..o..      |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |+.Ooo o....      |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |ooo  o   o .     |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |.. .. .  .+      |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: | ..  o oS=       |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |  . o =.B .      |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: | o . +.oo=       |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |  o   ..++       |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |       .o==E     |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: +----[SHA256]-----+
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Generating public/private ed25519 key pair.
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: The key fingerprint is:
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: SHA256:uvLJi7lrtXN2aS8sBBtLcAkp72k1t47OvvQS/RfRHP8 root@np0005542927.novalocal
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: The key's randomart image is:
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: +--[ED25519 256]--+
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |    .o .         |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |  . o o       .  |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |   o o       o o |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |    . * .   . o .|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |   . + OS.   .  .|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |    + =.+   .   E|
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |   . .o* o . .   |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |    o*=+= B .    |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: |   .=B@B.+ +.    |
Dec 02 22:51:58 np0005542927.novalocal cloud-init[920]: +----[SHA256]-----+
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Reached target Network is Online.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting System Logging Service...
Dec 02 22:51:58 np0005542927.novalocal sm-notify[1002]: Version 2.5.4 starting
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting Permit User Sessions...
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Finished Permit User Sessions.
Dec 02 22:51:58 np0005542927.novalocal sshd[1004]: Server listening on 0.0.0.0 port 22.
Dec 02 22:51:58 np0005542927.novalocal sshd[1004]: Server listening on :: port 22.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Started Command Scheduler.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Started Getty on tty1.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 02 22:51:58 np0005542927.novalocal crond[1007]: (CRON) STARTUP (1.5.7)
Dec 02 22:51:58 np0005542927.novalocal crond[1007]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 02 22:51:58 np0005542927.novalocal crond[1007]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 73% if used.)
Dec 02 22:51:58 np0005542927.novalocal crond[1007]: (CRON) INFO (running with inotify support)
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Reached target Login Prompts.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 02 22:51:58 np0005542927.novalocal rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Started System Logging Service.
Dec 02 22:51:58 np0005542927.novalocal rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Reached target Multi-User System.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 02 22:51:58 np0005542927.novalocal rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 22:51:58 np0005542927.novalocal kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec 02 22:51:58 np0005542927.novalocal kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 02 22:51:58 np0005542927.novalocal cloud-init[1068]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 02 Dec 2025 22:51:58 +0000. Up 9.28 seconds.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 02 22:51:58 np0005542927.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1225]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 02 Dec 2025 22:51:59 +0000. Up 9.72 seconds.
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1227]: Connection closed by 38.102.83.114 port 49536 [preauth]
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1241]: Unable to negotiate with 38.102.83.114 port 49546: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1254]: #############################################################
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1259]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1265]: 256 SHA256:qfewvzqEBkVHh2/hOi2TgqNMwEdGJSanSIdhW7evX/4 root@np0005542927.novalocal (ECDSA)
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1263]: Unable to negotiate with 38.102.83.114 port 49564: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1274]: 256 SHA256:uvLJi7lrtXN2aS8sBBtLcAkp72k1t47OvvQS/RfRHP8 root@np0005542927.novalocal (ED25519)
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1280]: 3072 SHA256:wVdZIJSaXVB5m/BuTEn6JtCiWRL+X80UWlGx5e2496k root@np0005542927.novalocal (RSA)
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1275]: Unable to negotiate with 38.102.83.114 port 49578: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1282]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1284]: #############################################################
Dec 02 22:51:59 np0005542927.novalocal dracut[1292]: dracut-057-102.git20250818.el9
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1248]: Connection closed by 38.102.83.114 port 49558 [preauth]
Dec 02 22:51:59 np0005542927.novalocal cloud-init[1225]: Cloud-init v. 24.4-7.el9 finished at Tue, 02 Dec 2025 22:51:59 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.95 seconds
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1308]: Unable to negotiate with 38.102.83.114 port 49604: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1312]: Unable to negotiate with 38.102.83.114 port 49618: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1283]: Connection closed by 38.102.83.114 port 49590 [preauth]
Dec 02 22:51:59 np0005542927.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 02 22:51:59 np0005542927.novalocal systemd[1]: Reached target Cloud-init target.
Dec 02 22:51:59 np0005542927.novalocal sshd-session[1293]: Connection closed by 38.102.83.114 port 49600 [preauth]
Dec 02 22:51:59 np0005542927.novalocal dracut[1295]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: memstrack is not available
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 02 22:52:00 np0005542927.novalocal dracut[1295]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: memstrack is not available
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: *** Including module: systemd ***
Dec 02 22:52:01 np0005542927.novalocal chronyd[785]: Selected source 216.197.228.230 (2.centos.pool.ntp.org)
Dec 02 22:52:01 np0005542927.novalocal chronyd[785]: System clock TAI offset set to 37 seconds
Dec 02 22:52:01 np0005542927.novalocal dracut[1295]: *** Including module: fips ***
Dec 02 22:52:02 np0005542927.novalocal dracut[1295]: *** Including module: systemd-initrd ***
Dec 02 22:52:02 np0005542927.novalocal dracut[1295]: *** Including module: i18n ***
Dec 02 22:52:02 np0005542927.novalocal dracut[1295]: *** Including module: drm ***
Dec 02 22:52:02 np0005542927.novalocal dracut[1295]: *** Including module: prefixdevname ***
Dec 02 22:52:02 np0005542927.novalocal dracut[1295]: *** Including module: kernel-modules ***
Dec 02 22:52:03 np0005542927.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]: *** Including module: kernel-modules-extra ***
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]: *** Including module: qemu ***
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]: *** Including module: fstab-sys ***
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]: *** Including module: rootfs-block ***
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]: *** Including module: terminfo ***
Dec 02 22:52:03 np0005542927.novalocal dracut[1295]: *** Including module: udev-rules ***
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: Skipping udev rule: 91-permissions.rules
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: *** Including module: virtiofs ***
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: *** Including module: dracut-systemd ***
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: *** Including module: usrmount ***
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: *** Including module: base ***
Dec 02 22:52:04 np0005542927.novalocal dracut[1295]: *** Including module: fs-lib ***
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]: *** Including module: kdumpbase ***
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:   microcode_ctl module: mangling fw_dir
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 02 22:52:05 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]: *** Including module: openssl ***
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]: *** Including module: shutdown ***
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]: *** Including module: squash ***
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: IRQ 25 affinity is now unmanaged
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: IRQ 31 affinity is now unmanaged
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: IRQ 28 affinity is now unmanaged
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: IRQ 32 affinity is now unmanaged
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: IRQ 30 affinity is now unmanaged
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 02 22:52:06 np0005542927.novalocal irqbalance[791]: IRQ 29 affinity is now unmanaged
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]: *** Including modules done ***
Dec 02 22:52:06 np0005542927.novalocal dracut[1295]: *** Installing kernel module dependencies ***
Dec 02 22:52:06 np0005542927.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 22:52:07 np0005542927.novalocal dracut[1295]: *** Installing kernel module dependencies done ***
Dec 02 22:52:07 np0005542927.novalocal dracut[1295]: *** Resolving executable dependencies ***
Dec 02 22:52:08 np0005542927.novalocal dracut[1295]: *** Resolving executable dependencies done ***
Dec 02 22:52:08 np0005542927.novalocal dracut[1295]: *** Generating early-microcode cpio image ***
Dec 02 22:52:08 np0005542927.novalocal dracut[1295]: *** Store current command line parameters ***
Dec 02 22:52:08 np0005542927.novalocal dracut[1295]: Stored kernel commandline:
Dec 02 22:52:08 np0005542927.novalocal dracut[1295]: No dracut internal kernel commandline stored in the initramfs
Dec 02 22:52:09 np0005542927.novalocal dracut[1295]: *** Install squash loader ***
Dec 02 22:52:10 np0005542927.novalocal dracut[1295]: *** Squashing the files inside the initramfs ***
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: *** Squashing the files inside the initramfs done ***
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: *** Hardlinking files ***
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Mode:           real
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Files:          50
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Linked:         0 files
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Compared:       0 xattrs
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Compared:       0 files
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Saved:          0 B
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: Duration:       0.000898 seconds
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: *** Hardlinking files done ***
Dec 02 22:52:11 np0005542927.novalocal dracut[1295]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 02 22:52:12 np0005542927.novalocal kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec 02 22:52:12 np0005542927.novalocal kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec 02 22:52:12 np0005542927.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 02 22:52:12 np0005542927.novalocal systemd[1]: Startup finished in 1.749s (kernel) + 2.840s (initrd) + 18.268s (userspace) = 22.858s.
Dec 02 22:52:26 np0005542927.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 22:53:07 np0005542927.novalocal chronyd[785]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Dec 02 22:53:31 np0005542927.novalocal sshd-session[4296]: Accepted publickey for zuul from 38.102.83.114 port 37884 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 02 22:53:31 np0005542927.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 02 22:53:31 np0005542927.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 02 22:53:31 np0005542927.novalocal systemd-logind[795]: New session 1 of user zuul.
Dec 02 22:53:31 np0005542927.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 02 22:53:31 np0005542927.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Queued start job for default target Main User Target.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Created slice User Application Slice.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Reached target Paths.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Reached target Timers.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Starting D-Bus User Message Bus Socket...
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Starting Create User's Volatile Files and Directories...
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Finished Create User's Volatile Files and Directories.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Listening on D-Bus User Message Bus Socket.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Reached target Sockets.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Reached target Basic System.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Reached target Main User Target.
Dec 02 22:53:31 np0005542927.novalocal systemd[4300]: Startup finished in 152ms.
Dec 02 22:53:31 np0005542927.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 02 22:53:31 np0005542927.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 02 22:53:31 np0005542927.novalocal sshd-session[4296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 22:53:32 np0005542927.novalocal python3[4382]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:34 np0005542927.novalocal python3[4410]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:41 np0005542927.novalocal python3[4468]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:43 np0005542927.novalocal python3[4508]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 02 22:53:45 np0005542927.novalocal python3[4534]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD4U+kYD0NRncUxX+JcUB7DRqILQVJDR8uhD3ks0Ul/EeWgVXEffIQ6qoDCZfMe9U2/TbSPjeQxRgrxfeojshOb90kAetShddcGy8qV3/MPy+wQVVV1rYoL3quzM5Aq/+yxxiNhtzetxzg9+fYQ2RCmPT7lduoZwxU6u936ZxDFI68NvtLWahzQ+M1heDP7uxDZQ9tlgqT8eJifbx4ZTmiC+l6jhgkBDnXRH6h2kIx0R+gfYMaSYKEzVhzG6w1nXZR6xFPAwbbZcUIMhMqO/V4dbStGfEcoodG69OYWlpuu93qNro9BrqffizKa2kknlLLFvgueRkCId9XrXN6p7IuzIjTV0h3DIY3uLZmygElHHoI8eKSOWIxwqSq866WuVWi4PlJY4XAhlllCpoOs8C9ugUhElgy1QCzbZerHV5m5m+Z42nwFUfeUQFIloFxeTlJIWCIXceAA1LqMifrZXliCnI20C7EFs4NwaJq4wBnBa7i/RKHSIWz3twW0fOCBh5c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:46 np0005542927.novalocal python3[4558]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:46 np0005542927.novalocal python3[4657]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:53:47 np0005542927.novalocal python3[4728]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764716026.2789502-229-189211078204032/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=248d0ab726be451cb68f679daa67a6cc_id_rsa follow=False checksum=ecd6e0f03c6e65bc1b100de91867e2771eda57bc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:47 np0005542927.novalocal python3[4851]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:53:48 np0005542927.novalocal python3[4922]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764716027.2884536-273-61371525756907/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=248d0ab726be451cb68f679daa67a6cc_id_rsa.pub follow=False checksum=b88ecc87674b1a86abb7c76bf7f4e81098e02b1a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:49 np0005542927.novalocal python3[4970]: ansible-ping Invoked with data=pong
Dec 02 22:53:50 np0005542927.novalocal python3[4994]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 22:53:52 np0005542927.novalocal python3[5052]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 02 22:53:53 np0005542927.novalocal python3[5084]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:53 np0005542927.novalocal python3[5108]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:53 np0005542927.novalocal python3[5132]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:54 np0005542927.novalocal python3[5156]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:54 np0005542927.novalocal python3[5180]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:54 np0005542927.novalocal python3[5204]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:56 np0005542927.novalocal sudo[5228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvchrrkeiwweayxwxmpadvezbutwkseo ; /usr/bin/python3'
Dec 02 22:53:56 np0005542927.novalocal sudo[5228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:53:56 np0005542927.novalocal python3[5230]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:56 np0005542927.novalocal sudo[5228]: pam_unix(sudo:session): session closed for user root
Dec 02 22:53:56 np0005542927.novalocal sudo[5306]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edyustkglwuotvpsxdatapiejqdzngju ; /usr/bin/python3'
Dec 02 22:53:56 np0005542927.novalocal sudo[5306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:53:56 np0005542927.novalocal python3[5308]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:53:56 np0005542927.novalocal sudo[5306]: pam_unix(sudo:session): session closed for user root
Dec 02 22:53:57 np0005542927.novalocal sudo[5379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzsjlaqnhytqbugjgknaqwoqsgssubkk ; /usr/bin/python3'
Dec 02 22:53:57 np0005542927.novalocal sudo[5379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:53:57 np0005542927.novalocal python3[5381]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716036.3620672-26-141868027113692/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:53:57 np0005542927.novalocal sudo[5379]: pam_unix(sudo:session): session closed for user root
Dec 02 22:53:58 np0005542927.novalocal python3[5429]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:58 np0005542927.novalocal python3[5453]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:58 np0005542927.novalocal python3[5477]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:58 np0005542927.novalocal python3[5501]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:59 np0005542927.novalocal python3[5525]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:59 np0005542927.novalocal python3[5549]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:53:59 np0005542927.novalocal python3[5573]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542927.novalocal python3[5597]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542927.novalocal python3[5621]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:00 np0005542927.novalocal python3[5645]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542927.novalocal python3[5669]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542927.novalocal python3[5693]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542927.novalocal python3[5717]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:01 np0005542927.novalocal python3[5741]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:02 np0005542927.novalocal python3[5765]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:02 np0005542927.novalocal python3[5789]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:02 np0005542927.novalocal python3[5813]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542927.novalocal python3[5837]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542927.novalocal python3[5861]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542927.novalocal python3[5885]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:03 np0005542927.novalocal python3[5909]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:04 np0005542927.novalocal python3[5933]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:04 np0005542927.novalocal python3[5957]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:04 np0005542927.novalocal python3[5981]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:05 np0005542927.novalocal python3[6005]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:05 np0005542927.novalocal python3[6029]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 22:54:07 np0005542927.novalocal sudo[6053]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjckkqihnfnlzmquzmzvthbttwuuuhwx ; /usr/bin/python3'
Dec 02 22:54:07 np0005542927.novalocal sudo[6053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:08 np0005542927.novalocal python3[6055]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 02 22:54:08 np0005542927.novalocal systemd[1]: Starting Time & Date Service...
Dec 02 22:54:08 np0005542927.novalocal systemd[1]: Started Time & Date Service.
Dec 02 22:54:08 np0005542927.novalocal systemd-timedated[6057]: Changed time zone to 'UTC' (UTC).
Dec 02 22:54:08 np0005542927.novalocal sudo[6053]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:08 np0005542927.novalocal sudo[6084]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkbvdkadewfbmndezmffpbyyftsrjsuq ; /usr/bin/python3'
Dec 02 22:54:08 np0005542927.novalocal sudo[6084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:08 np0005542927.novalocal python3[6086]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:08 np0005542927.novalocal sudo[6084]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:09 np0005542927.novalocal python3[6162]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:09 np0005542927.novalocal python3[6233]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764716048.8244057-202-51506674544094/source _original_basename=tmpheies422 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:10 np0005542927.novalocal python3[6333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:10 np0005542927.novalocal python3[6404]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764716049.7407172-242-115453363431350/source _original_basename=tmpg6satibk follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:11 np0005542927.novalocal sudo[6504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dywnvukjhzfwdeztnpzstxgxtlmvfood ; /usr/bin/python3'
Dec 02 22:54:11 np0005542927.novalocal sudo[6504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:11 np0005542927.novalocal python3[6506]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:11 np0005542927.novalocal sudo[6504]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:11 np0005542927.novalocal sudo[6577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqhobernecenqlblnfpnkahjulglcswn ; /usr/bin/python3'
Dec 02 22:54:11 np0005542927.novalocal sudo[6577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:11 np0005542927.novalocal python3[6579]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764716050.9275863-306-31329618578780/source _original_basename=tmpgtkilw1v follow=False checksum=b5d32a20a180d280e10f96c8ee4e4addc6022f99 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:11 np0005542927.novalocal sudo[6577]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:12 np0005542927.novalocal python3[6627]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:54:12 np0005542927.novalocal python3[6653]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:54:12 np0005542927.novalocal sudo[6731]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbhlgswlobmoqswgkbpafvplgubcslei ; /usr/bin/python3'
Dec 02 22:54:12 np0005542927.novalocal sudo[6731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:13 np0005542927.novalocal python3[6733]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:54:13 np0005542927.novalocal sudo[6731]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:13 np0005542927.novalocal sudo[6804]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxdkkfynidhavuxtedgqrhxrsxsofrkt ; /usr/bin/python3'
Dec 02 22:54:13 np0005542927.novalocal sudo[6804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:13 np0005542927.novalocal python3[6806]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716052.737652-362-218406961449248/source _original_basename=tmp5tnz4tp1 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:13 np0005542927.novalocal sudo[6804]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:13 np0005542927.novalocal sudo[6855]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxqpncmnvcdoqmyhkuliqudhqcrsqyzo ; /usr/bin/python3'
Dec 02 22:54:13 np0005542927.novalocal sudo[6855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:14 np0005542927.novalocal python3[6857]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-8d85-4712-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:54:14 np0005542927.novalocal sudo[6855]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:14 np0005542927.novalocal python3[6885]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ec2-ffbe-8d85-4712-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 02 22:54:16 np0005542927.novalocal python3[6913]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:33 np0005542927.novalocal sudo[6937]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpqqxnjpteurobxwhidpguurwvhitswy ; /usr/bin/python3'
Dec 02 22:54:33 np0005542927.novalocal sudo[6937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:54:33 np0005542927.novalocal python3[6939]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:54:33 np0005542927.novalocal sudo[6937]: pam_unix(sudo:session): session closed for user root
Dec 02 22:54:38 np0005542927.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 02 22:55:08 np0005542927.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 02 22:55:08 np0005542927.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5553] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 22:55:08 np0005542927.novalocal systemd-udevd[6943]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5692] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5719] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5722] device (eth1): carrier: link connected
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5724] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5730] policy: auto-activating connection 'Wired connection 1' (c810458a-fafa-397b-a0ff-2bc5f1fe918f)
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5734] device (eth1): Activation: starting connection 'Wired connection 1' (c810458a-fafa-397b-a0ff-2bc5f1fe918f)
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5735] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5738] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5742] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 22:55:08 np0005542927.novalocal NetworkManager[860]: <info>  [1764716108.5746] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:09 np0005542927.novalocal python3[6969]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-ab69-8977-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:55:16 np0005542927.novalocal sudo[7047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iijccdoevksjdnlbqhjiuybuxpzguqps ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:55:16 np0005542927.novalocal sudo[7047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:55:16 np0005542927.novalocal python3[7049]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:55:16 np0005542927.novalocal sudo[7047]: pam_unix(sudo:session): session closed for user root
Dec 02 22:55:16 np0005542927.novalocal sudo[7120]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nctkhgvzhdjpjvwzumqjfgabiluhlnei ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:55:16 np0005542927.novalocal sudo[7120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:55:17 np0005542927.novalocal python3[7122]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764716116.2762117-103-92536136124825/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=370a13671f8cdc917fa48d86ab839efbe2067510 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:55:17 np0005542927.novalocal sudo[7120]: pam_unix(sudo:session): session closed for user root
Dec 02 22:55:17 np0005542927.novalocal sudo[7170]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxqcxhyidmwfkdfrhbgebtwwinoupdgf ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:55:17 np0005542927.novalocal sudo[7170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:55:17 np0005542927.novalocal python3[7172]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: Stopping Network Manager...
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9556] caught SIGTERM, shutting down normally.
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9569] dhcp4 (eth0): canceled DHCP transaction
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9569] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9570] dhcp4 (eth0): state changed no lease
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9573] manager: NetworkManager state is now CONNECTING
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9726] dhcp4 (eth1): canceled DHCP transaction
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9726] dhcp4 (eth1): state changed no lease
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 22:55:17 np0005542927.novalocal NetworkManager[860]: <info>  [1764716117.9792] exiting (success)
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: Stopped Network Manager.
Dec 02 22:55:17 np0005542927.novalocal systemd[1]: NetworkManager.service: Consumed 1.494s CPU time, 10.2M memory peak.
Dec 02 22:55:18 np0005542927.novalocal systemd[1]: Starting Network Manager...
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.0609] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:13ba9be2-a183-422c-a29d-1c0aec36730d)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.0612] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.0683] manager[0x55e56bf55070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 22:55:18 np0005542927.novalocal systemd[1]: Starting Hostname Service...
Dec 02 22:55:18 np0005542927.novalocal systemd[1]: Started Hostname Service.
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1625] hostname: hostname: using hostnamed
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1627] hostname: static hostname changed from (none) to "np0005542927.novalocal"
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1633] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1640] manager[0x55e56bf55070]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1640] manager[0x55e56bf55070]: rfkill: WWAN hardware radio set enabled
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1682] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1683] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1684] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1684] manager: Networking is enabled by state file
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1688] settings: Loaded settings plugin: keyfile (internal)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1694] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1737] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1753] dhcp: init: Using DHCP client 'internal'
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1758] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1767] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1777] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1794] device (lo): Activation: starting connection 'lo' (c5650cb8-9795-426f-8b6b-42dce70d6cce)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1807] device (eth0): carrier: link connected
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1814] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1823] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1826] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1838] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1850] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1861] device (eth1): carrier: link connected
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1869] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1878] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c810458a-fafa-397b-a0ff-2bc5f1fe918f) (indicated)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1880] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1890] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1902] device (eth1): Activation: starting connection 'Wired connection 1' (c810458a-fafa-397b-a0ff-2bc5f1fe918f)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1914] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 22:55:18 np0005542927.novalocal systemd[1]: Started Network Manager.
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1923] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1928] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1932] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1938] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1944] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1950] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1955] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1962] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1973] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.1979] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2001] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2012] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2043] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2050] dhcp4 (eth0): state changed new lease, address=38.102.83.77
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2061] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2074] device (lo): Activation: successful, device activated.
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2097] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 22:55:18 np0005542927.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2184] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2258] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2262] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2270] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2277] device (eth0): Activation: successful, device activated.
Dec 02 22:55:18 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716118.2288] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 22:55:18 np0005542927.novalocal sudo[7170]: pam_unix(sudo:session): session closed for user root
Dec 02 22:55:18 np0005542927.novalocal python3[7256]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-ab69-8977-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 22:55:28 np0005542927.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 22:55:38 np0005542927.novalocal systemd[4300]: Starting Mark boot as successful...
Dec 02 22:55:38 np0005542927.novalocal systemd[4300]: Finished Mark boot as successful.
Dec 02 22:55:48 np0005542927.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.2923] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 22:56:03 np0005542927.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 22:56:03 np0005542927.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3268] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3269] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3274] device (eth1): Activation: successful, device activated.
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3279] manager: startup complete
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3281] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <warn>  [1764716163.3285] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3290] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3476] dhcp4 (eth1): canceled DHCP transaction
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3476] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3476] dhcp4 (eth1): state changed no lease
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3488] policy: auto-activating connection 'ci-private-network' (2b145667-e1cd-593e-beec-410c178624e9)
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3490] device (eth1): Activation: starting connection 'ci-private-network' (2b145667-e1cd-593e-beec-410c178624e9)
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3491] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3493] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3498] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3503] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3541] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3542] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 22:56:03 np0005542927.novalocal NetworkManager[7181]: <info>  [1764716163.3547] device (eth1): Activation: successful, device activated.
Dec 02 22:56:13 np0005542927.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 22:56:18 np0005542927.novalocal sshd-session[4309]: Received disconnect from 38.102.83.114 port 37884:11: disconnected by user
Dec 02 22:56:18 np0005542927.novalocal sshd-session[4309]: Disconnected from user zuul 38.102.83.114 port 37884
Dec 02 22:56:18 np0005542927.novalocal sshd-session[4296]: pam_unix(sshd:session): session closed for user zuul
Dec 02 22:56:18 np0005542927.novalocal systemd-logind[795]: Session 1 logged out. Waiting for processes to exit.
Dec 02 22:56:41 np0005542927.novalocal sshd-session[7285]: Accepted publickey for zuul from 38.102.83.114 port 47328 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 22:56:41 np0005542927.novalocal systemd-logind[795]: New session 3 of user zuul.
Dec 02 22:56:41 np0005542927.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 02 22:56:41 np0005542927.novalocal sshd-session[7285]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 22:56:42 np0005542927.novalocal sudo[7364]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqhosxvvmkpjfeymxhglmxrlaccbvrrr ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:56:42 np0005542927.novalocal sudo[7364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:56:42 np0005542927.novalocal python3[7366]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 22:56:42 np0005542927.novalocal sudo[7364]: pam_unix(sudo:session): session closed for user root
Dec 02 22:56:42 np0005542927.novalocal sudo[7437]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjfgkolgxpjjhowfjgtxmmpttyiithrz ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 02 22:56:42 np0005542927.novalocal sudo[7437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 22:56:42 np0005542927.novalocal python3[7439]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716202.100883-309-188399368644600/source _original_basename=tmp9bd63wau follow=False checksum=e4fb491a69cfdde53e5aa8dc33e934e0cc50f41b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 22:56:42 np0005542927.novalocal sudo[7437]: pam_unix(sudo:session): session closed for user root
Dec 02 22:56:46 np0005542927.novalocal sshd-session[7288]: Connection closed by 38.102.83.114 port 47328
Dec 02 22:56:46 np0005542927.novalocal sshd-session[7285]: pam_unix(sshd:session): session closed for user zuul
Dec 02 22:56:46 np0005542927.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 02 22:56:46 np0005542927.novalocal systemd-logind[795]: Session 3 logged out. Waiting for processes to exit.
Dec 02 22:56:46 np0005542927.novalocal systemd-logind[795]: Removed session 3.
Dec 02 22:57:03 np0005542927.novalocal sshd-session[7464]: Received disconnect from 45.78.218.154 port 34846:11: Bye Bye [preauth]
Dec 02 22:57:03 np0005542927.novalocal sshd-session[7464]: Disconnected from authenticating user root 45.78.218.154 port 34846 [preauth]
Dec 02 22:57:47 np0005542927.novalocal sshd-session[7467]: Invalid user sshadmin from 80.94.95.116 port 35490
Dec 02 22:57:47 np0005542927.novalocal sshd-session[7467]: Connection closed by invalid user sshadmin 80.94.95.116 port 35490 [preauth]
Dec 02 22:58:38 np0005542927.novalocal systemd[4300]: Created slice User Background Tasks Slice.
Dec 02 22:58:38 np0005542927.novalocal systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Dec 02 22:58:38 np0005542927.novalocal systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Dec 02 23:00:11 np0005542927.novalocal sshd-session[7470]: Connection closed by 45.78.218.154 port 51116 [preauth]
Dec 02 23:01:01 np0005542927.novalocal CROND[7474]: (root) CMD (run-parts /etc/cron.hourly)
Dec 02 23:01:01 np0005542927.novalocal run-parts[7477]: (/etc/cron.hourly) starting 0anacron
Dec 02 23:01:01 np0005542927.novalocal anacron[7485]: Anacron started on 2025-12-02
Dec 02 23:01:01 np0005542927.novalocal anacron[7485]: Will run job `cron.daily' in 37 min.
Dec 02 23:01:01 np0005542927.novalocal anacron[7485]: Will run job `cron.weekly' in 57 min.
Dec 02 23:01:01 np0005542927.novalocal anacron[7485]: Will run job `cron.monthly' in 77 min.
Dec 02 23:01:01 np0005542927.novalocal anacron[7485]: Jobs will be executed sequentially
Dec 02 23:01:01 np0005542927.novalocal run-parts[7487]: (/etc/cron.hourly) finished 0anacron
Dec 02 23:01:01 np0005542927.novalocal CROND[7473]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 02 23:03:05 np0005542927.novalocal sshd-session[7491]: Accepted publickey for zuul from 38.102.83.114 port 38796 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:03:05 np0005542927.novalocal systemd-logind[795]: New session 4 of user zuul.
Dec 02 23:03:05 np0005542927.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 02 23:03:05 np0005542927.novalocal sshd-session[7491]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:03:06 np0005542927.novalocal sudo[7518]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpiggrjsctwifpxubdqhecbqkhtulfhe ; /usr/bin/python3'
Dec 02 23:03:06 np0005542927.novalocal sudo[7518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:06 np0005542927.novalocal python3[7520]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ec2-ffbe-3fa7-c4d8-000000001cd7-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:06 np0005542927.novalocal sudo[7518]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:06 np0005542927.novalocal sudo[7546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgzttqzsvqnqqkjpbybqunkcvjqyuwsd ; /usr/bin/python3'
Dec 02 23:03:06 np0005542927.novalocal sudo[7546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:06 np0005542927.novalocal python3[7548]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:06 np0005542927.novalocal sudo[7546]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:06 np0005542927.novalocal sudo[7573]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztzowgzlrslsqctcabpdwmdgabhhbajb ; /usr/bin/python3'
Dec 02 23:03:06 np0005542927.novalocal sudo[7573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:06 np0005542927.novalocal python3[7575]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:06 np0005542927.novalocal sudo[7573]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:07 np0005542927.novalocal sudo[7599]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdrjbzmaediksgyqxgmefivrxqaooozb ; /usr/bin/python3'
Dec 02 23:03:07 np0005542927.novalocal sudo[7599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:07 np0005542927.novalocal python3[7601]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:07 np0005542927.novalocal sudo[7599]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:07 np0005542927.novalocal sudo[7625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okzoczyfwbefowdyqkkldzqaylktfeam ; /usr/bin/python3'
Dec 02 23:03:07 np0005542927.novalocal sudo[7625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:07 np0005542927.novalocal python3[7627]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:07 np0005542927.novalocal sudo[7625]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:07 np0005542927.novalocal sudo[7651]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdwluadzayigjbnnfsluvhwqkmozkmwm ; /usr/bin/python3'
Dec 02 23:03:07 np0005542927.novalocal sudo[7651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:07 np0005542927.novalocal python3[7653]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:07 np0005542927.novalocal sudo[7651]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:08 np0005542927.novalocal sudo[7729]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaxiyixuwtooxihcnvbpowkouhrsxruj ; /usr/bin/python3'
Dec 02 23:03:08 np0005542927.novalocal sudo[7729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:08 np0005542927.novalocal python3[7731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:03:08 np0005542927.novalocal sudo[7729]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:08 np0005542927.novalocal sudo[7802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-entluvvldwwemtvmwopimkwzwtknbqut ; /usr/bin/python3'
Dec 02 23:03:08 np0005542927.novalocal sudo[7802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:08 np0005542927.novalocal python3[7804]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716588.2534013-494-133782869291312/source _original_basename=tmp4wo42mdx follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:03:08 np0005542927.novalocal sudo[7802]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:09 np0005542927.novalocal sudo[7852]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woejrlnidfloazuxlwmqovffkxvhcojy ; /usr/bin/python3'
Dec 02 23:03:09 np0005542927.novalocal sudo[7852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:10 np0005542927.novalocal python3[7854]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:03:10 np0005542927.novalocal systemd[1]: Reloading.
Dec 02 23:03:10 np0005542927.novalocal systemd-rc-local-generator[7874]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:03:10 np0005542927.novalocal sudo[7852]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:11 np0005542927.novalocal sudo[7909]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vimqpnpzvnjpimijqunuggxnlxpseidk ; /usr/bin/python3'
Dec 02 23:03:11 np0005542927.novalocal sudo[7909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:11 np0005542927.novalocal python3[7911]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 02 23:03:11 np0005542927.novalocal sudo[7909]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:11 np0005542927.novalocal sudo[7935]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvuzxvjfnggonbyrfxihlyoxpkmcwivs ; /usr/bin/python3'
Dec 02 23:03:11 np0005542927.novalocal sudo[7935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542927.novalocal python3[7937]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542927.novalocal sudo[7935]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:12 np0005542927.novalocal sudo[7963]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpzygpqgxsafoghtaorgbqsbmcyyshpw ; /usr/bin/python3'
Dec 02 23:03:12 np0005542927.novalocal sudo[7963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542927.novalocal python3[7965]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542927.novalocal sudo[7963]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:12 np0005542927.novalocal sudo[7991]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwjywdqsetunjzcbjiizvjfznaifgcdr ; /usr/bin/python3'
Dec 02 23:03:12 np0005542927.novalocal sudo[7991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542927.novalocal python3[7993]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542927.novalocal sudo[7991]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:12 np0005542927.novalocal sudo[8019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnpoenbwuplpzneuncjwliexvaqcanci ; /usr/bin/python3'
Dec 02 23:03:12 np0005542927.novalocal sudo[8019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:12 np0005542927.novalocal python3[8021]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:12 np0005542927.novalocal sudo[8019]: pam_unix(sudo:session): session closed for user root
Dec 02 23:03:13 np0005542927.novalocal python3[8048]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-3fa7-c4d8-000000001cde-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:03:14 np0005542927.novalocal python3[8078]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 02 23:03:16 np0005542927.novalocal sshd-session[7494]: Connection closed by 38.102.83.114 port 38796
Dec 02 23:03:16 np0005542927.novalocal sshd-session[7491]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:03:16 np0005542927.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 02 23:03:16 np0005542927.novalocal systemd[1]: session-4.scope: Consumed 4.864s CPU time.
Dec 02 23:03:16 np0005542927.novalocal systemd-logind[795]: Session 4 logged out. Waiting for processes to exit.
Dec 02 23:03:16 np0005542927.novalocal systemd-logind[795]: Removed session 4.
Dec 02 23:03:18 np0005542927.novalocal sshd-session[8083]: Accepted publickey for zuul from 38.102.83.114 port 51356 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:03:18 np0005542927.novalocal systemd-logind[795]: New session 5 of user zuul.
Dec 02 23:03:18 np0005542927.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 02 23:03:18 np0005542927.novalocal sshd-session[8083]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:03:18 np0005542927.novalocal sudo[8110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ammoddghszdcoqukxxftifeceqclptbx ; /usr/bin/python3'
Dec 02 23:03:18 np0005542927.novalocal sudo[8110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:03:18 np0005542927.novalocal python3[8112]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:03:31 np0005542927.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:03:41 np0005542927.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:03:51 np0005542927.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:03:52 np0005542927.novalocal setsebool[8175]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 02 23:03:52 np0005542927.novalocal setsebool[8175]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  Converting 389 SID table entries...
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:04:04 np0005542927.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:04:11 np0005542927.novalocal sshd-session[8890]: Invalid user user from 78.128.112.74 port 38504
Dec 02 23:04:11 np0005542927.novalocal sshd-session[8890]: Connection closed by invalid user user 78.128.112.74 port 38504 [preauth]
Dec 02 23:04:21 np0005542927.novalocal dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 02 23:04:21 np0005542927.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:04:21 np0005542927.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:04:21 np0005542927.novalocal systemd[1]: Reloading.
Dec 02 23:04:21 np0005542927.novalocal systemd-rc-local-generator[8932]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:04:21 np0005542927.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:04:22 np0005542927.novalocal sudo[8110]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:26 np0005542927.novalocal python3[12051]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-6827-230b-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:04:26 np0005542927.novalocal kernel: evm: overlay not supported
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: Starting D-Bus User Message Bus...
Dec 02 23:04:26 np0005542927.novalocal dbus-broker-launch[12675]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: Started D-Bus User Message Bus.
Dec 02 23:04:26 np0005542927.novalocal dbus-broker-launch[12675]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 02 23:04:26 np0005542927.novalocal dbus-broker-lau[12675]: Ready
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: Created slice Slice /user.
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: podman-12552.scope: unit configures an IP firewall, but not running as root.
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Dec 02 23:04:26 np0005542927.novalocal systemd[4300]: Started podman-12552.scope.
Dec 02 23:04:27 np0005542927.novalocal systemd[4300]: Started podman-pause-743992df.scope.
Dec 02 23:04:27 np0005542927.novalocal sudo[13163]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owjjjsbmvxbuzbjjsqsaydduoicefdkz ; /usr/bin/python3'
Dec 02 23:04:27 np0005542927.novalocal sudo[13163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:27 np0005542927.novalocal python3[13184]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.2:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.2:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:04:27 np0005542927.novalocal python3[13184]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 02 23:04:27 np0005542927.novalocal sudo[13163]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:28 np0005542927.novalocal sshd-session[8086]: Connection closed by 38.102.83.114 port 51356
Dec 02 23:04:28 np0005542927.novalocal sshd-session[8083]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:04:28 np0005542927.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 02 23:04:28 np0005542927.novalocal systemd[1]: session-5.scope: Consumed 1min 2.507s CPU time.
Dec 02 23:04:28 np0005542927.novalocal systemd-logind[795]: Session 5 logged out. Waiting for processes to exit.
Dec 02 23:04:28 np0005542927.novalocal systemd-logind[795]: Removed session 5.
Dec 02 23:04:36 np0005542927.novalocal irqbalance[791]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 02 23:04:36 np0005542927.novalocal irqbalance[791]: IRQ 27 affinity is now unmanaged
Dec 02 23:04:47 np0005542927.novalocal sshd-session[19412]: Unable to negotiate with 38.102.83.66 port 35308: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 02 23:04:47 np0005542927.novalocal sshd-session[19415]: Connection closed by 38.102.83.66 port 35300 [preauth]
Dec 02 23:04:47 np0005542927.novalocal sshd-session[19421]: Connection closed by 38.102.83.66 port 35292 [preauth]
Dec 02 23:04:47 np0005542927.novalocal sshd-session[19418]: Unable to negotiate with 38.102.83.66 port 35322: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 02 23:04:47 np0005542927.novalocal sshd-session[19417]: Unable to negotiate with 38.102.83.66 port 35338: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 02 23:04:52 np0005542927.novalocal sshd-session[20748]: Accepted publickey for zuul from 38.102.83.114 port 47888 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:04:52 np0005542927.novalocal systemd-logind[795]: New session 6 of user zuul.
Dec 02 23:04:52 np0005542927.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 02 23:04:52 np0005542927.novalocal sshd-session[20748]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:04:52 np0005542927.novalocal python3[20848]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN59DNT3Ni5luGimbJB902j8ywAXk/V0moDqx3ShASHiCOzoT242Be+x+X2vIUoDwfIddRBT8pqsU1aeIxWrMFc= zuul@np0005542926.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 23:04:52 np0005542927.novalocal sudo[20998]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eameqtecowadckriwcflpofamqfodbgq ; /usr/bin/python3'
Dec 02 23:04:52 np0005542927.novalocal sudo[20998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:53 np0005542927.novalocal python3[21008]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN59DNT3Ni5luGimbJB902j8ywAXk/V0moDqx3ShASHiCOzoT242Be+x+X2vIUoDwfIddRBT8pqsU1aeIxWrMFc= zuul@np0005542926.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 23:04:53 np0005542927.novalocal sudo[20998]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:53 np0005542927.novalocal sudo[21316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idxtmoxwjvzmqxfdecsxhpuscgbtunon ; /usr/bin/python3'
Dec 02 23:04:53 np0005542927.novalocal sudo[21316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:53 np0005542927.novalocal python3[21327]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005542927.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 02 23:04:53 np0005542927.novalocal useradd[21395]: new group: name=cloud-admin, GID=1002
Dec 02 23:04:53 np0005542927.novalocal useradd[21395]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 02 23:04:54 np0005542927.novalocal sudo[21316]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:54 np0005542927.novalocal sudo[21506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrkgkygrxrgmzptlbhtembuvvybbyfuq ; /usr/bin/python3'
Dec 02 23:04:54 np0005542927.novalocal sudo[21506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:54 np0005542927.novalocal python3[21518]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN59DNT3Ni5luGimbJB902j8ywAXk/V0moDqx3ShASHiCOzoT242Be+x+X2vIUoDwfIddRBT8pqsU1aeIxWrMFc= zuul@np0005542926.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 02 23:04:54 np0005542927.novalocal sudo[21506]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:54 np0005542927.novalocal sudo[21739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmpncitdbhbytqjhumczyibibmlzmztl ; /usr/bin/python3'
Dec 02 23:04:54 np0005542927.novalocal sudo[21739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:54 np0005542927.novalocal python3[21748]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:04:54 np0005542927.novalocal sudo[21739]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:55 np0005542927.novalocal sudo[21977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhlcwwgpuoepbyvjcywvykychwvzsnaw ; /usr/bin/python3'
Dec 02 23:04:55 np0005542927.novalocal sudo[21977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:55 np0005542927.novalocal python3[21989]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764716694.5915189-151-261655786795298/source _original_basename=tmpij6694pd follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:04:55 np0005542927.novalocal sudo[21977]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:56 np0005542927.novalocal sudo[22234]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kctbgjkzgpzlmkdmexykppwbdtopgqpu ; /usr/bin/python3'
Dec 02 23:04:56 np0005542927.novalocal sudo[22234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:04:56 np0005542927.novalocal python3[22244]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 02 23:04:56 np0005542927.novalocal systemd[1]: Starting Hostname Service...
Dec 02 23:04:56 np0005542927.novalocal systemd[1]: Started Hostname Service.
Dec 02 23:04:56 np0005542927.novalocal systemd-hostnamed[22322]: Changed pretty hostname to 'compute-0'
Dec 02 23:04:56 compute-0 systemd-hostnamed[22322]: Hostname set to <compute-0> (static)
Dec 02 23:04:56 compute-0 NetworkManager[7181]: <info>  [1764716696.4862] hostname: static hostname changed from "np0005542927.novalocal" to "compute-0"
Dec 02 23:04:56 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 23:04:56 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 23:04:56 compute-0 sudo[22234]: pam_unix(sudo:session): session closed for user root
Dec 02 23:04:57 compute-0 sshd-session[20797]: Connection closed by 38.102.83.114 port 47888
Dec 02 23:04:57 compute-0 sshd-session[20748]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:04:57 compute-0 systemd-logind[795]: Session 6 logged out. Waiting for processes to exit.
Dec 02 23:04:57 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 02 23:04:57 compute-0 systemd[1]: session-6.scope: Consumed 2.695s CPU time.
Dec 02 23:04:57 compute-0 systemd-logind[795]: Removed session 6.
Dec 02 23:05:00 compute-0 sshd-session[22793]: Received disconnect from 45.78.218.154 port 35560:11: Bye Bye [preauth]
Dec 02 23:05:00 compute-0 sshd-session[22793]: Disconnected from authenticating user root 45.78.218.154 port 35560 [preauth]
Dec 02 23:05:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 23:05:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:05:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:05:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 15.571s CPU time.
Dec 02 23:05:22 compute-0 systemd[1]: run-rb26f4931f6a44920a303337bd0b738e2.service: Deactivated successfully.
Dec 02 23:05:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 23:07:27 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 02 23:07:27 compute-0 sshd-session[29936]: Invalid user esuser from 45.78.218.154 port 34008
Dec 02 23:07:27 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 02 23:07:27 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 02 23:07:27 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 02 23:07:29 compute-0 sshd-session[29936]: Received disconnect from 45.78.218.154 port 34008:11: Bye Bye [preauth]
Dec 02 23:07:29 compute-0 sshd-session[29936]: Disconnected from invalid user esuser 45.78.218.154 port 34008 [preauth]
Dec 02 23:08:38 compute-0 sshd-session[29943]: Accepted publickey for zuul from 38.102.83.66 port 36072 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:08:38 compute-0 systemd-logind[795]: New session 7 of user zuul.
Dec 02 23:08:38 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 02 23:08:38 compute-0 sshd-session[29943]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:08:38 compute-0 python3[30019]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:08:41 compute-0 sudo[30133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwqvipplrkumfyyusodazuyhitjnptnh ; /usr/bin/python3'
Dec 02 23:08:41 compute-0 sudo[30133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:41 compute-0 python3[30135]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:41 compute-0 sudo[30133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:42 compute-0 sudo[30206]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knphnhdlwusfalurnbucygvbsizwosuh ; /usr/bin/python3'
Dec 02 23:08:42 compute-0 sudo[30206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:42 compute-0 python3[30208]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=delorean.repo follow=False checksum=411ac78a3f8a50f4fad8cedb733e290aaaf7f3f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:42 compute-0 sudo[30206]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:42 compute-0 sudo[30232]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aloajrbnwtmcxxbjicekdckqckhqrych ; /usr/bin/python3'
Dec 02 23:08:42 compute-0 sudo[30232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:42 compute-0 python3[30234]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:42 compute-0 sudo[30232]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:42 compute-0 sudo[30305]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phcecdjgfplbaosupptllodwdatcvhfm ; /usr/bin/python3'
Dec 02 23:08:42 compute-0 sudo[30305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-0 python3[30307]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:43 compute-0 sudo[30305]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:43 compute-0 sudo[30331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plxtrgrhafihukeazgqwdrrosjwhznby ; /usr/bin/python3'
Dec 02 23:08:43 compute-0 sudo[30331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-0 python3[30333]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:43 compute-0 sudo[30331]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:43 compute-0 sudo[30404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsfrktrovecipkonoanjkpzvirmdksed ; /usr/bin/python3'
Dec 02 23:08:43 compute-0 sudo[30404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-0 python3[30406]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:43 compute-0 sudo[30404]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:43 compute-0 sudo[30430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sztwgzpvrzgwfanniqyrurqoimmrwizn ; /usr/bin/python3'
Dec 02 23:08:43 compute-0 sudo[30430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:43 compute-0 python3[30432]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:44 compute-0 sudo[30430]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:44 compute-0 sudo[30503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huicxqgmlclvrwkzmpesnhvdeeiaikrj ; /usr/bin/python3'
Dec 02 23:08:44 compute-0 sudo[30503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:44 compute-0 python3[30505]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:44 compute-0 sudo[30503]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:44 compute-0 sudo[30529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnwlecsrllegasjzzjndoaknrwovqcdu ; /usr/bin/python3'
Dec 02 23:08:44 compute-0 sudo[30529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:44 compute-0 python3[30531]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:44 compute-0 sudo[30529]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:45 compute-0 sudo[30602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xezeqobemvtujxwwrdkvkqbguawwlncn ; /usr/bin/python3'
Dec 02 23:08:45 compute-0 sudo[30602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:45 compute-0 python3[30604]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:45 compute-0 sudo[30602]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:45 compute-0 sudo[30628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmeyrelgivdghcagtiuxhjljwexlstqj ; /usr/bin/python3'
Dec 02 23:08:45 compute-0 sudo[30628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:45 compute-0 python3[30630]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:45 compute-0 sudo[30628]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:45 compute-0 sudo[30701]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyxgrktelkphwvefylhxkvtmisdlfolx ; /usr/bin/python3'
Dec 02 23:08:45 compute-0 sudo[30701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:45 compute-0 python3[30703]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:45 compute-0 sudo[30701]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:46 compute-0 sudo[30727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fawxcaaosmlvxnpaxakcvmsxunvxclch ; /usr/bin/python3'
Dec 02 23:08:46 compute-0 sudo[30727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:46 compute-0 python3[30729]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:08:46 compute-0 sudo[30727]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:46 compute-0 sudo[30800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqaplafjpdqssqsnylbljqqbxfngiiuz ; /usr/bin/python3'
Dec 02 23:08:46 compute-0 sudo[30800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:08:46 compute-0 python3[30802]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764716921.4423504-33763-190729976065255/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=fa2c662325f345c065cf09a4d87ff5b21ab5eb35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:08:46 compute-0 sudo[30800]: pam_unix(sudo:session): session closed for user root
Dec 02 23:08:49 compute-0 sshd-session[30827]: Connection closed by 192.168.122.11 port 58114 [preauth]
Dec 02 23:08:49 compute-0 sshd-session[30829]: Connection closed by 192.168.122.11 port 58108 [preauth]
Dec 02 23:08:49 compute-0 sshd-session[30830]: Unable to negotiate with 192.168.122.11 port 58126: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 02 23:08:49 compute-0 sshd-session[30831]: Unable to negotiate with 192.168.122.11 port 58134: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 02 23:08:49 compute-0 sshd-session[30828]: Unable to negotiate with 192.168.122.11 port 58136: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 02 23:09:38 compute-0 sshd-session[30837]: Connection closed by authenticating user root 80.94.95.116 port 33434 [preauth]
Dec 02 23:09:44 compute-0 python3[30862]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:12:08 compute-0 sshd-session[30864]: Invalid user devuser from 45.78.218.154 port 54282
Dec 02 23:12:09 compute-0 sshd-session[30864]: Received disconnect from 45.78.218.154 port 54282:11: Bye Bye [preauth]
Dec 02 23:12:09 compute-0 sshd-session[30864]: Disconnected from invalid user devuser 45.78.218.154 port 54282 [preauth]
Dec 02 23:14:32 compute-0 sshd-session[30867]: Received disconnect from 45.78.218.154 port 53794:11: Bye Bye [preauth]
Dec 02 23:14:32 compute-0 sshd-session[30867]: Disconnected from authenticating user root 45.78.218.154 port 53794 [preauth]
Dec 02 23:14:44 compute-0 sshd-session[29946]: Received disconnect from 38.102.83.66 port 36072:11: disconnected by user
Dec 02 23:14:44 compute-0 sshd-session[29946]: Disconnected from user zuul 38.102.83.66 port 36072
Dec 02 23:14:44 compute-0 sshd-session[29943]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:14:44 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 02 23:14:44 compute-0 systemd[1]: session-7.scope: Consumed 5.991s CPU time.
Dec 02 23:14:44 compute-0 systemd-logind[795]: Session 7 logged out. Waiting for processes to exit.
Dec 02 23:14:44 compute-0 systemd-logind[795]: Removed session 7.
Dec 02 23:16:38 compute-0 systemd[1]: Starting dnf makecache...
Dec 02 23:16:38 compute-0 dnf[30870]: Failed determining last makecache time.
Dec 02 23:16:42 compute-0 dnf[30870]: delorean-python-castellan-609f4ea667df386849930 3.1 kB/s |  13 kB     00:04
Dec 02 23:16:42 compute-0 dnf[30870]: delorean-openstack-ironic-c525a16b06266b6b474c9 1.5 MB/s |  64 kB     00:00
Dec 02 23:16:43 compute-0 dnf[30870]: delorean-openstack-cinder-92c645f1f1e913b5b1cd8  25 kB/s |  30 kB     00:01
Dec 02 23:16:44 compute-0 dnf[30870]: delorean-ansible-collections-openstack-f584c54d 184 kB/s | 121 kB     00:00
Dec 02 23:16:44 compute-0 dnf[30870]: delorean-openstack-ceilometer-60803e710e7f5b3cd 632 kB/s |  24 kB     00:00
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-openstack-kolla-e7bd46dad0b62ff151667b  36 kB/s | 274 kB     00:07
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-openstack-nova-3e7017eb2952d5258d96e27 150 kB/s |  37 kB     00:00
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-openstack-designate-82652559ea8641b11c 156 kB/s |  19 kB     00:00
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-openstack-glance-e055873be4079bc9d3716 485 kB/s |  19 kB     00:00
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-openstack-keystone-4f1b7e96e38463d5fcd 649 kB/s |  23 kB     00:00
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-openstack-manila-70623bb84e7880f7f2f75 614 kB/s |  27 kB     00:00
Dec 02 23:16:52 compute-0 dnf[30870]: delorean-python-networking-mlnx-7139a7f0bce9d6a 3.7 MB/s | 130 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-openstack-octavia-e981d3e172b8e4471f97 260 kB/s |  25 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-openstack-watcher-71470dac73abba9e5dcf 165 kB/s |  17 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-ansible-config_template-5ccaa22121a7ff 265 kB/s | 7.9 kB     00:00
Dec 02 23:16:53 compute-0 sshd-session[30888]: Received disconnect from 45.78.218.154 port 59858:11: Bye Bye [preauth]
Dec 02 23:16:53 compute-0 sshd-session[30888]: Disconnected from authenticating user root 45.78.218.154 port 59858 [preauth]
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-puppet-magnum-ec92e647ad5e77720f01cce0 3.6 MB/s | 155 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-openstack-swift-e10c2bafcb8fc80929bce3 173 kB/s |  15 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-python-mistral-tests-tempest-900580c95 927 kB/s |  35 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: delorean-python-django-horizon-915b939b342dc65f 2.7 MB/s | 105 kB     00:00
Dec 02 23:16:53 compute-0 dnf[30870]: CentOS Stream 9 - BaseOS                         41 kB/s | 5.9 kB     00:00
Dec 02 23:16:54 compute-0 dnf[30870]: CentOS Stream 9 - AppStream                      26 kB/s | 6.0 kB     00:00
Dec 02 23:16:54 compute-0 dnf[30870]: CentOS Stream 9 - CRB                            65 kB/s | 5.8 kB     00:00
Dec 02 23:16:54 compute-0 dnf[30870]: CentOS Stream 9 - Extras packages                27 kB/s | 8.3 kB     00:00
Dec 02 23:16:54 compute-0 dnf[30870]: dlrn-master-testing                              40 MB/s | 2.4 MB     00:00
Dec 02 23:16:57 compute-0 dnf[30870]: dlrn-master-build-deps                          232 kB/s | 516 kB     00:02
Dec 02 23:16:57 compute-0 dnf[30870]: centos9-rabbitmq                                8.3 MB/s | 123 kB     00:00
Dec 02 23:16:57 compute-0 dnf[30870]: centos9-storage                                  30 MB/s | 415 kB     00:00
Dec 02 23:16:58 compute-0 dnf[30870]: centos9-opstools                                3.9 MB/s |  51 kB     00:00
Dec 02 23:16:58 compute-0 dnf[30870]: NFV SIG OpenvSwitch                              21 MB/s | 456 kB     00:00
Dec 02 23:16:58 compute-0 dnf[30870]: repo-setup-centos-appstream                      61 MB/s |  25 MB     00:00
Dec 02 23:17:04 compute-0 dnf[30870]: repo-setup-centos-baseos                         67 MB/s | 8.8 MB     00:00
Dec 02 23:17:05 compute-0 dnf[30870]: repo-setup-centos-highavailability               32 MB/s | 744 kB     00:00
Dec 02 23:17:06 compute-0 dnf[30870]: repo-setup-centos-powertools                     82 MB/s | 7.3 MB     00:00
Dec 02 23:17:09 compute-0 dnf[30870]: Extra Packages for Enterprise Linux 9 - x86_64   12 MB/s |  20 MB     00:01
Dec 02 23:17:22 compute-0 dnf[30870]: Metadata cache created.
Dec 02 23:17:22 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 02 23:17:22 compute-0 systemd[1]: Finished dnf makecache.
Dec 02 23:17:22 compute-0 systemd[1]: dnf-makecache.service: Consumed 24.665s CPU time.
Dec 02 23:19:11 compute-0 sshd-session[30975]: Invalid user zjw from 45.78.218.154 port 35246
Dec 02 23:19:20 compute-0 sshd-session[30975]: Received disconnect from 45.78.218.154 port 35246:11: Bye Bye [preauth]
Dec 02 23:19:20 compute-0 sshd-session[30975]: Disconnected from invalid user zjw 45.78.218.154 port 35246 [preauth]
Dec 02 23:21:33 compute-0 sshd-session[30978]: Invalid user marco from 45.78.218.154 port 44564
Dec 02 23:21:33 compute-0 sshd-session[30978]: Received disconnect from 45.78.218.154 port 44564:11: Bye Bye [preauth]
Dec 02 23:21:33 compute-0 sshd-session[30978]: Disconnected from invalid user marco 45.78.218.154 port 44564 [preauth]
Dec 02 23:21:41 compute-0 sshd-session[30980]: Invalid user ubnt from 80.94.95.116 port 45110
Dec 02 23:21:42 compute-0 sshd-session[30980]: Connection closed by invalid user ubnt 80.94.95.116 port 45110 [preauth]
Dec 02 23:21:43 compute-0 sshd-session[30982]: Accepted publickey for zuul from 192.168.122.30 port 56740 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:21:43 compute-0 systemd-logind[795]: New session 8 of user zuul.
Dec 02 23:21:43 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 02 23:21:43 compute-0 sshd-session[30982]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:21:44 compute-0 python3.9[31135]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:21:45 compute-0 sudo[31314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nghjrtijrcrdhbsjnetxmjhqtgbdqrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717705.3172002-44-127573057706797/AnsiballZ_command.py'
Dec 02 23:21:45 compute-0 sudo[31314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:21:45 compute-0 python3.9[31316]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:21:53 compute-0 sudo[31314]: pam_unix(sudo:session): session closed for user root
Dec 02 23:21:53 compute-0 sshd-session[30985]: Connection closed by 192.168.122.30 port 56740
Dec 02 23:21:53 compute-0 sshd-session[30982]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:21:53 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 02 23:21:53 compute-0 systemd[1]: session-8.scope: Consumed 7.729s CPU time.
Dec 02 23:21:53 compute-0 systemd-logind[795]: Session 8 logged out. Waiting for processes to exit.
Dec 02 23:21:53 compute-0 systemd-logind[795]: Removed session 8.
Dec 02 23:21:58 compute-0 sshd-session[31374]: Accepted publickey for zuul from 192.168.122.30 port 45562 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:21:58 compute-0 systemd-logind[795]: New session 9 of user zuul.
Dec 02 23:21:58 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 02 23:21:58 compute-0 sshd-session[31374]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:21:59 compute-0 python3.9[31527]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:00 compute-0 sshd-session[31377]: Connection closed by 192.168.122.30 port 45562
Dec 02 23:22:00 compute-0 sshd-session[31374]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:22:00 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 02 23:22:00 compute-0 systemd-logind[795]: Session 9 logged out. Waiting for processes to exit.
Dec 02 23:22:00 compute-0 systemd-logind[795]: Removed session 9.
Dec 02 23:22:16 compute-0 sshd-session[31555]: Accepted publickey for zuul from 192.168.122.30 port 58574 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:22:16 compute-0 systemd-logind[795]: New session 10 of user zuul.
Dec 02 23:22:16 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 02 23:22:16 compute-0 sshd-session[31555]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:22:16 compute-0 python3.9[31708]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 02 23:22:18 compute-0 python3.9[31882]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:19 compute-0 sudo[32032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajgvhtvkmsyxzczghtnsdcptyccmbeke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717738.5507762-69-29303144661052/AnsiballZ_command.py'
Dec 02 23:22:19 compute-0 sudo[32032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:19 compute-0 python3.9[32034]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:22:19 compute-0 sudo[32032]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:20 compute-0 sudo[32185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzztwzmsygwuksttmzculfdjpbncywu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717739.6675382-93-163775337332130/AnsiballZ_stat.py'
Dec 02 23:22:20 compute-0 sudo[32185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:20 compute-0 python3.9[32187]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:22:20 compute-0 sudo[32185]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:21 compute-0 sudo[32337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyggvtkwaptqymcwwrmguhwstlomjaeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717740.528309-109-35502808726337/AnsiballZ_file.py'
Dec 02 23:22:21 compute-0 sudo[32337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:21 compute-0 python3.9[32339]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:22:21 compute-0 sudo[32337]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:21 compute-0 sudo[32489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruucyzcyrrlvieulrsvpjpscjijtzzkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717741.4108496-125-259013633287620/AnsiballZ_stat.py'
Dec 02 23:22:21 compute-0 sudo[32489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:21 compute-0 python3.9[32491]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:22:21 compute-0 sudo[32489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:22 compute-0 sudo[32612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhixxizjoubutajmcsnnugbkadwqqgcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717741.4108496-125-259013633287620/AnsiballZ_copy.py'
Dec 02 23:22:22 compute-0 sudo[32612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:22 compute-0 python3.9[32614]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764717741.4108496-125-259013633287620/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:22:22 compute-0 sudo[32612]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:23 compute-0 sudo[32764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypheczqvkncehyaxtywajpntdfbdpxbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717742.8814902-155-192335129528486/AnsiballZ_setup.py'
Dec 02 23:22:23 compute-0 sudo[32764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:23 compute-0 python3.9[32766]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:23 compute-0 sudo[32764]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:24 compute-0 sudo[32920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlrrihpepyxwtwqcnocwiukojnnlomtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717743.922154-171-42510368282627/AnsiballZ_file.py'
Dec 02 23:22:24 compute-0 sudo[32920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:24 compute-0 python3.9[32922]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:22:24 compute-0 sudo[32920]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:25 compute-0 sudo[33072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpwspjorgwuehlvugvmweduzmvpxmfxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717744.7878876-189-240664766138965/AnsiballZ_file.py'
Dec 02 23:22:25 compute-0 sudo[33072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:25 compute-0 python3.9[33074]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:22:25 compute-0 sudo[33072]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:26 compute-0 python3.9[33224]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:22:30 compute-0 python3.9[33478]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:22:32 compute-0 python3.9[33628]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:33 compute-0 python3.9[33782]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:22:34 compute-0 sudo[33938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzqhnkznvbrksclhovdltoztpuxsnazc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717754.0490062-285-58552830725172/AnsiballZ_setup.py'
Dec 02 23:22:34 compute-0 sudo[33938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:34 compute-0 python3.9[33940]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:22:35 compute-0 sudo[33938]: pam_unix(sudo:session): session closed for user root
Dec 02 23:22:35 compute-0 sudo[34023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rltjlwyrkhtrxezhkiwrzjhfulnuayxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717754.0490062-285-58552830725172/AnsiballZ_dnf.py'
Dec 02 23:22:35 compute-0 sudo[34023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:22:35 compute-0 python3.9[34025]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:23:16 compute-0 systemd[1]: Reloading.
Dec 02 23:23:17 compute-0 systemd-rc-local-generator[34223]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:23:17 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 02 23:23:17 compute-0 systemd[1]: Reloading.
Dec 02 23:23:17 compute-0 systemd-rc-local-generator[34258]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:23:17 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 02 23:23:17 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 02 23:23:17 compute-0 systemd[1]: Reloading.
Dec 02 23:23:17 compute-0 systemd-rc-local-generator[34303]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:23:18 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 02 23:23:18 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:23:18 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:23:18 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:24:20 compute-0 kernel: SELinux:  Converting 2719 SID table entries...
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:24:20 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:24:20 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 02 23:24:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:24:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:24:21 compute-0 systemd[1]: Reloading.
Dec 02 23:24:21 compute-0 systemd-rc-local-generator[34637]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:24:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:24:21 compute-0 sudo[34023]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:24:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:24:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.189s CPU time.
Dec 02 23:24:22 compute-0 systemd[1]: run-rfb17e64b1b014506a21aab35fd6228bd.service: Deactivated successfully.
Dec 02 23:24:22 compute-0 sudo[35556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wchbmrbswrfcqjkenibwiohwefbczxjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717862.0335193-309-165183542278974/AnsiballZ_command.py'
Dec 02 23:24:22 compute-0 sudo[35556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:22 compute-0 python3.9[35558]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:24:23 compute-0 sudo[35556]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:24 compute-0 sudo[35837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsdgxtqlayxjbefvfdgbphmvomkxsadr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717863.547387-325-180932015150563/AnsiballZ_selinux.py'
Dec 02 23:24:24 compute-0 sudo[35837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:24 compute-0 python3.9[35839]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 02 23:24:24 compute-0 sudo[35837]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:25 compute-0 sudo[35989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrsugjjdohewrkdpfzuwxrlwnocgdebt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717864.89812-347-63502319804428/AnsiballZ_command.py'
Dec 02 23:24:25 compute-0 sudo[35989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:25 compute-0 python3.9[35991]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 02 23:24:26 compute-0 sudo[35989]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:26 compute-0 sudo[36143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plwbmgdlrmlodbsjitrmpvsguzdvcsus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717866.700704-363-114754629608277/AnsiballZ_file.py'
Dec 02 23:24:26 compute-0 sudo[36143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:28 compute-0 python3.9[36145]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:24:28 compute-0 sudo[36143]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:29 compute-0 sudo[36295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcrxawftjondjtatzuwkfyjswxlzcznr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717868.5561576-379-261552449255056/AnsiballZ_mount.py'
Dec 02 23:24:29 compute-0 sudo[36295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:29 compute-0 python3.9[36297]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 02 23:24:29 compute-0 sudo[36295]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:30 compute-0 sudo[36447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spaluyaulioifhaptbphfisjdscvwhvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717870.2897108-435-176390053672348/AnsiballZ_file.py'
Dec 02 23:24:30 compute-0 sudo[36447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:30 compute-0 python3.9[36449]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:30 compute-0 sudo[36447]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:31 compute-0 sudo[36599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drvgyehbdsbzdfssdkaffjqpfqepxhgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717871.1613982-451-254907012185481/AnsiballZ_stat.py'
Dec 02 23:24:31 compute-0 sudo[36599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:34 compute-0 python3.9[36601]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:24:34 compute-0 sudo[36599]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:34 compute-0 sudo[36722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgwtoxzylrxzxyrveogwiaezxcrrwpqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717871.1613982-451-254907012185481/AnsiballZ_copy.py'
Dec 02 23:24:34 compute-0 sudo[36722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:37 compute-0 python3.9[36724]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764717871.1613982-451-254907012185481/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:24:37 compute-0 sudo[36722]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:38 compute-0 sudo[36874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nerhxbdxfvsryfyvxmvhpyxcvhdolovc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717878.3440886-499-232747395498241/AnsiballZ_stat.py'
Dec 02 23:24:38 compute-0 sudo[36874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:38 compute-0 python3.9[36876]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:24:38 compute-0 sudo[36874]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:40 compute-0 sudo[37026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emfzzjdegpkeofusnmxbybuwwtynauur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717880.5989263-515-264480545642141/AnsiballZ_command.py'
Dec 02 23:24:40 compute-0 sudo[37026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:41 compute-0 python3.9[37028]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:24:41 compute-0 sudo[37026]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:41 compute-0 sudo[37179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osfcxzbffeudkthgjmwkulsnuaxsrxms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717881.3997483-531-173380471947821/AnsiballZ_file.py'
Dec 02 23:24:41 compute-0 sudo[37179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:41 compute-0 python3.9[37181]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:24:41 compute-0 sudo[37179]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:42 compute-0 sudo[37331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxoquohrjfyrcujltjoxofplonjbzveh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717882.4120502-553-129291677060946/AnsiballZ_getent.py'
Dec 02 23:24:42 compute-0 sudo[37331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:43 compute-0 python3.9[37333]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 02 23:24:43 compute-0 sudo[37331]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:43 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:24:43 compute-0 sudo[37485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpbamxywbgmoyhmknhhmoqnxumvohyxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717883.2952874-569-20747346328057/AnsiballZ_group.py'
Dec 02 23:24:43 compute-0 sudo[37485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:43 compute-0 python3.9[37487]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:24:44 compute-0 groupadd[37488]: group added to /etc/group: name=qemu, GID=107
Dec 02 23:24:44 compute-0 groupadd[37488]: group added to /etc/gshadow: name=qemu
Dec 02 23:24:44 compute-0 groupadd[37488]: new group: name=qemu, GID=107
Dec 02 23:24:44 compute-0 sudo[37485]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:44 compute-0 sudo[37643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axjzhywqdkeiqxcnmjqmghlwlwqfrwkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717884.2771652-585-125309219652298/AnsiballZ_user.py'
Dec 02 23:24:44 compute-0 sudo[37643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:44 compute-0 python3.9[37645]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:24:45 compute-0 useradd[37647]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:24:45 compute-0 sudo[37643]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:45 compute-0 sudo[37803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvuillzxvgtigsvtvpzlmdsuxhrtnlzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717885.366361-601-164351056113483/AnsiballZ_getent.py'
Dec 02 23:24:45 compute-0 sudo[37803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:45 compute-0 python3.9[37805]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 02 23:24:45 compute-0 sudo[37803]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:46 compute-0 sudo[37956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iksicxuojbgllkifpxmfclkuzspzolzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717886.1825564-617-216920506050271/AnsiballZ_group.py'
Dec 02 23:24:46 compute-0 sudo[37956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:46 compute-0 python3.9[37958]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:24:46 compute-0 groupadd[37959]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 02 23:24:46 compute-0 groupadd[37959]: group added to /etc/gshadow: name=hugetlbfs
Dec 02 23:24:46 compute-0 groupadd[37959]: new group: name=hugetlbfs, GID=42477
Dec 02 23:24:46 compute-0 sudo[37956]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:47 compute-0 sudo[38114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqxufbnopsroklephlogwvsmfvxvgtwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717887.054868-635-174026899190802/AnsiballZ_file.py'
Dec 02 23:24:47 compute-0 sudo[38114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:47 compute-0 python3.9[38116]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 02 23:24:47 compute-0 sudo[38114]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:48 compute-0 sudo[38266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quptonyfqadkbdjavcgomsduogxsdrzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717888.048987-657-187588811789135/AnsiballZ_dnf.py'
Dec 02 23:24:48 compute-0 sudo[38266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:48 compute-0 python3.9[38268]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:24:50 compute-0 sudo[38266]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:50 compute-0 sudo[38419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbrmcpxzjwhecebmioecziurqxvoeqyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717890.5422447-673-60050325606181/AnsiballZ_file.py'
Dec 02 23:24:50 compute-0 sudo[38419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:51 compute-0 python3.9[38421]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:51 compute-0 sudo[38419]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:51 compute-0 sudo[38571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixnmwcyntflkucfwdlocvuoazpfizxsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717891.3134146-689-269245077961303/AnsiballZ_stat.py'
Dec 02 23:24:51 compute-0 sudo[38571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:51 compute-0 python3.9[38573]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:24:51 compute-0 sudo[38571]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:52 compute-0 sudo[38694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-divmildlkvxdgqapsihfmsfoswovdigi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717891.3134146-689-269245077961303/AnsiballZ_copy.py'
Dec 02 23:24:52 compute-0 sudo[38694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:52 compute-0 python3.9[38696]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764717891.3134146-689-269245077961303/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:52 compute-0 sudo[38694]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:53 compute-0 sudo[38846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncmxjogwvuqeazeyukpyoopgnurelgrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717892.677217-719-65475884850819/AnsiballZ_systemd.py'
Dec 02 23:24:53 compute-0 sudo[38846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:53 compute-0 python3.9[38848]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:24:53 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 02 23:24:53 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 02 23:24:53 compute-0 kernel: Bridge firewalling registered
Dec 02 23:24:53 compute-0 systemd-modules-load[38852]: Inserted module 'br_netfilter'
Dec 02 23:24:53 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 02 23:24:53 compute-0 sudo[38846]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:54 compute-0 sudo[39006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sluwposzstcniogswkzdxwjwbrmxlmmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717893.961528-735-66543373903734/AnsiballZ_stat.py'
Dec 02 23:24:54 compute-0 sudo[39006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:54 compute-0 python3.9[39008]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:24:54 compute-0 sudo[39006]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:54 compute-0 sudo[39129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxlialcicybfhjjpjoywdocifqdubqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717893.961528-735-66543373903734/AnsiballZ_copy.py'
Dec 02 23:24:54 compute-0 sudo[39129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:54 compute-0 python3.9[39131]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764717893.961528-735-66543373903734/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:24:54 compute-0 sudo[39129]: pam_unix(sudo:session): session closed for user root
Dec 02 23:24:56 compute-0 sudo[39281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsuutzugttszjwjbosfkquhalcvetccm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717895.8002708-771-223187240846267/AnsiballZ_dnf.py'
Dec 02 23:24:56 compute-0 sudo[39281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:24:56 compute-0 python3.9[39283]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:24:59 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:24:59 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:24:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:24:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:24:59 compute-0 systemd[1]: Reloading.
Dec 02 23:24:59 compute-0 systemd-rc-local-generator[39347]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:25:00 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:25:01 compute-0 sudo[39281]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:01 compute-0 python3.9[41462]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:25:02 compute-0 python3.9[42470]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 02 23:25:03 compute-0 python3.9[43134]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:25:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:25:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:25:03 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.053s CPU time.
Dec 02 23:25:03 compute-0 systemd[1]: run-r01cab96635c34d1186c66194b961b519.service: Deactivated successfully.
Dec 02 23:25:04 compute-0 sudo[43501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdoygjdcnkxpdjgmgvkdxgyxjsajidmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717903.9949932-849-75661865622066/AnsiballZ_command.py'
Dec 02 23:25:04 compute-0 sudo[43501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:04 compute-0 python3.9[43503]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:04 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 23:25:05 compute-0 systemd[1]: Starting Authorization Manager...
Dec 02 23:25:05 compute-0 polkitd[43720]: Started polkitd version 0.117
Dec 02 23:25:05 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 23:25:05 compute-0 polkitd[43720]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 23:25:05 compute-0 polkitd[43720]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 23:25:05 compute-0 polkitd[43720]: Finished loading, compiling and executing 2 rules
Dec 02 23:25:05 compute-0 polkitd[43720]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 02 23:25:05 compute-0 systemd[1]: Started Authorization Manager.
Dec 02 23:25:05 compute-0 sudo[43501]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:05 compute-0 sudo[43888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayqiyuhkzwclxgvqfvmlrsqadrdvbozs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717905.5356815-867-82718177158608/AnsiballZ_systemd.py'
Dec 02 23:25:05 compute-0 sudo[43888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:06 compute-0 python3.9[43890]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:25:06 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 02 23:25:06 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 02 23:25:06 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 02 23:25:06 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 02 23:25:06 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 02 23:25:06 compute-0 sudo[43888]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:07 compute-0 python3.9[44053]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 02 23:25:10 compute-0 sudo[44203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnwygmynrrjkvanibrrzjzbjqhujghrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717910.2367048-981-239476134044140/AnsiballZ_systemd.py'
Dec 02 23:25:10 compute-0 sudo[44203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:10 compute-0 python3.9[44205]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:25:10 compute-0 systemd[1]: Reloading.
Dec 02 23:25:11 compute-0 systemd-rc-local-generator[44236]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:25:11 compute-0 sudo[44203]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:11 compute-0 sudo[44393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svfoygettadouezyxsstvyqlumuxxsoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717911.4075882-981-246647715195647/AnsiballZ_systemd.py'
Dec 02 23:25:11 compute-0 sudo[44393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:12 compute-0 python3.9[44395]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:25:12 compute-0 systemd[1]: Reloading.
Dec 02 23:25:12 compute-0 systemd-rc-local-generator[44420]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:25:12 compute-0 sudo[44393]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:12 compute-0 sudo[44582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wawgyiydcmggtvhvilkftmjtswiegtbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717912.6420496-1013-32715027087060/AnsiballZ_command.py'
Dec 02 23:25:12 compute-0 sudo[44582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:13 compute-0 python3.9[44584]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:13 compute-0 sudo[44582]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:13 compute-0 sudo[44735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnyelyjxstmvkdvdmetrqclzhfoqogyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717913.4562407-1029-174701450066651/AnsiballZ_command.py'
Dec 02 23:25:13 compute-0 sudo[44735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:13 compute-0 python3.9[44737]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:13 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 02 23:25:13 compute-0 sudo[44735]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:14 compute-0 sudo[44888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpduzpwswwvehsdnlzhlvkotuuwbisgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717914.3022623-1045-243158709697492/AnsiballZ_command.py'
Dec 02 23:25:14 compute-0 sudo[44888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:14 compute-0 python3.9[44890]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:16 compute-0 sudo[44888]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:16 compute-0 sudo[45050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzwqlmwbgkzgioucekqqcdssbgsgstez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717916.6181743-1061-55751055065953/AnsiballZ_command.py'
Dec 02 23:25:16 compute-0 sudo[45050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:17 compute-0 python3.9[45052]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:17 compute-0 sudo[45050]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:17 compute-0 sudo[45203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsjdqvlfxvrdcfzitrnzgbjqnjtecush ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717917.3260376-1077-88453865789437/AnsiballZ_systemd.py'
Dec 02 23:25:17 compute-0 sudo[45203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:17 compute-0 python3.9[45205]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:25:18 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 02 23:25:18 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 02 23:25:18 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 02 23:25:18 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 02 23:25:18 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 02 23:25:18 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 02 23:25:18 compute-0 sudo[45203]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:18 compute-0 sshd-session[31558]: Connection closed by 192.168.122.30 port 58574
Dec 02 23:25:18 compute-0 sshd-session[31555]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:25:18 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 02 23:25:18 compute-0 systemd[1]: session-10.scope: Consumed 2min 11.485s CPU time.
Dec 02 23:25:18 compute-0 systemd-logind[795]: Session 10 logged out. Waiting for processes to exit.
Dec 02 23:25:18 compute-0 systemd-logind[795]: Removed session 10.
Dec 02 23:25:23 compute-0 sshd-session[45235]: Accepted publickey for zuul from 192.168.122.30 port 56358 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:25:23 compute-0 systemd-logind[795]: New session 11 of user zuul.
Dec 02 23:25:23 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 02 23:25:23 compute-0 sshd-session[45235]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:25:24 compute-0 python3.9[45388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:25 compute-0 python3.9[45542]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:26 compute-0 sudo[45696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rraufpagdlxejudvznwrafkfhyjsyodm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717926.4805322-80-74388791640431/AnsiballZ_command.py'
Dec 02 23:25:26 compute-0 sudo[45696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:27 compute-0 python3.9[45698]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:27 compute-0 sudo[45696]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:28 compute-0 python3.9[45849]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:29 compute-0 sudo[46003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bippokhhhwabqftahsqphzurqchgofnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717928.7564733-120-46846437015064/AnsiballZ_setup.py'
Dec 02 23:25:29 compute-0 sudo[46003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:29 compute-0 python3.9[46005]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:25:29 compute-0 sudo[46003]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:30 compute-0 sudo[46087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugsiqrwdebowaznzjhlenuzslgncwegj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717928.7564733-120-46846437015064/AnsiballZ_dnf.py'
Dec 02 23:25:30 compute-0 sudo[46087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:30 compute-0 python3.9[46089]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:25:31 compute-0 sudo[46087]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:32 compute-0 sudo[46240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxekhufmsfkplofilcilhkajapmvbwem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717931.8084176-144-213423083341566/AnsiballZ_setup.py'
Dec 02 23:25:32 compute-0 sudo[46240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:32 compute-0 python3.9[46242]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:25:32 compute-0 sudo[46240]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:33 compute-0 sudo[46411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmnvpnuiykdfpkysfhxkkfvlsvlbxtom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717932.9910307-166-64424357857393/AnsiballZ_file.py'
Dec 02 23:25:33 compute-0 sudo[46411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:33 compute-0 python3.9[46413]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:25:33 compute-0 sudo[46411]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:34 compute-0 sudo[46563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tijlbfxhcvgbuxlviyhafnjrirrqycil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717933.9091172-182-248690473347497/AnsiballZ_command.py'
Dec 02 23:25:34 compute-0 sudo[46563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:34 compute-0 python3.9[46565]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:25:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2117349558-merged.mount: Deactivated successfully.
Dec 02 23:25:34 compute-0 podman[46566]: 2025-12-02 23:25:34.577994866 +0000 UTC m=+0.186523819 system refresh
Dec 02 23:25:34 compute-0 sudo[46563]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:35 compute-0 sudo[46726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ficxatcxjzbxaeyaskjrvoqatownekjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717934.900762-198-15960333162183/AnsiballZ_stat.py'
Dec 02 23:25:35 compute-0 sudo[46726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:35 compute-0 python3.9[46728]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:25:35 compute-0 sudo[46726]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:25:36 compute-0 sudo[46849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuxglpkmvieukdqxrgbwctyvpoarnyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717934.900762-198-15960333162183/AnsiballZ_copy.py'
Dec 02 23:25:36 compute-0 sudo[46849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:36 compute-0 python3.9[46851]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764717934.900762-198-15960333162183/.source.json follow=False _original_basename=podman_network_config.j2 checksum=32a6b2bd728b257c2f80e5a422063e4d5a986af5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:25:36 compute-0 sudo[46849]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:36 compute-0 sudo[47001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crrthgeuvnrwcogoufmksbszysmcwutz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717936.4344027-228-188111858860158/AnsiballZ_stat.py'
Dec 02 23:25:36 compute-0 sudo[47001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:37 compute-0 python3.9[47003]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:25:37 compute-0 sudo[47001]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:37 compute-0 sudo[47124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwyhtozydrqdfmtykatzhfwxjuxfpauv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717936.4344027-228-188111858860158/AnsiballZ_copy.py'
Dec 02 23:25:37 compute-0 sudo[47124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:37 compute-0 python3.9[47126]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764717936.4344027-228-188111858860158/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51dca2f6e7d675b0597f23a4e044edd3f4faff03 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:37 compute-0 sudo[47124]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:38 compute-0 sudo[47276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orjyxkywerymtabvgurcrwlrjbtliopu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717937.8427165-260-8839243384074/AnsiballZ_ini_file.py'
Dec 02 23:25:38 compute-0 sudo[47276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:38 compute-0 python3.9[47278]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:38 compute-0 sudo[47276]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:38 compute-0 sudo[47428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sitlchvgxkowkhowalteebbexaswewsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717938.5633621-260-78551595019879/AnsiballZ_ini_file.py'
Dec 02 23:25:38 compute-0 sudo[47428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:38 compute-0 python3.9[47430]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:39 compute-0 sudo[47428]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:39 compute-0 sudo[47580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npcavwkdkhuayysrrrgddiesdaniyrht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717939.1397448-260-7277343924497/AnsiballZ_ini_file.py'
Dec 02 23:25:39 compute-0 sudo[47580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:39 compute-0 python3.9[47582]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:39 compute-0 sudo[47580]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:40 compute-0 sudo[47732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxqoxenupcsjsqsjibfmnacelbhuexqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717939.7781632-260-277833257127443/AnsiballZ_ini_file.py'
Dec 02 23:25:40 compute-0 sudo[47732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:40 compute-0 python3.9[47734]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:25:40 compute-0 sudo[47732]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:41 compute-0 python3.9[47884]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:25:42 compute-0 sudo[48036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mevotkqeqspqkzrlmhdcylaihixvpaop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717941.9270782-340-239894926437732/AnsiballZ_dnf.py'
Dec 02 23:25:42 compute-0 sudo[48036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:42 compute-0 python3.9[48038]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:43 compute-0 sudo[48036]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:44 compute-0 sudo[48189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijmkkcrkqmphdmndhxvqpoakktjxrgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717943.957206-356-68243985044095/AnsiballZ_dnf.py'
Dec 02 23:25:44 compute-0 sudo[48189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:44 compute-0 python3.9[48191]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:46 compute-0 sudo[48189]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:47 compute-0 sudo[48349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwpprtyowtytjunrasmqgsjyurtbunwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717946.6912355-376-63283132952562/AnsiballZ_dnf.py'
Dec 02 23:25:47 compute-0 sudo[48349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:47 compute-0 python3.9[48351]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:48 compute-0 sudo[48349]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:49 compute-0 sudo[48502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nezkaenrlmzsrxyjvqttyaiuiwjhogzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717948.8105464-394-132072552094230/AnsiballZ_dnf.py'
Dec 02 23:25:49 compute-0 sudo[48502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:49 compute-0 python3.9[48504]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:50 compute-0 sudo[48502]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:51 compute-0 sudo[48655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuyxvgbgvsukbzofsphczmebhashsmev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717951.0495849-416-252232107541261/AnsiballZ_dnf.py'
Dec 02 23:25:51 compute-0 sudo[48655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:51 compute-0 python3.9[48657]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:53 compute-0 sudo[48655]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:53 compute-0 sudo[48811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzuxgmsltplxltsvixjrbsvnphysfgge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717953.3338535-432-105896274288603/AnsiballZ_dnf.py'
Dec 02 23:25:53 compute-0 sudo[48811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:53 compute-0 python3.9[48813]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:55 compute-0 sudo[48811]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:56 compute-0 sudo[48980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rddttddmgjaaxsnfczjbkgdtupuqpdes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717956.3869402-450-99164092540922/AnsiballZ_dnf.py'
Dec 02 23:25:56 compute-0 sudo[48980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:56 compute-0 python3.9[48982]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:25:58 compute-0 sudo[48980]: pam_unix(sudo:session): session closed for user root
Dec 02 23:25:58 compute-0 sudo[49133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmkxjcjzermlwiudjsnwkhqqqmcvkrpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717958.3934577-468-67773299350104/AnsiballZ_dnf.py'
Dec 02 23:25:58 compute-0 sudo[49133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:25:59 compute-0 python3.9[49135]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:26:10 compute-0 sudo[49133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:10 compute-0 sudo[49471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdcdybwooexapbzppchxuxqgzqsyxojd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717970.682038-486-245554048992417/AnsiballZ_dnf.py'
Dec 02 23:26:10 compute-0 sudo[49471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:11 compute-0 python3.9[49473]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:26:12 compute-0 sudo[49471]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:13 compute-0 sudo[49627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwsashypdeucgxbjtykjsdqapqlrbbvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717972.9444711-508-109879030416002/AnsiballZ_file.py'
Dec 02 23:26:13 compute-0 sudo[49627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:13 compute-0 python3.9[49629]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:26:13 compute-0 sudo[49627]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:14 compute-0 sudo[49802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgvmiimsedmxqitkqrbhruxxfkjcdskr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717973.814008-524-280579906102638/AnsiballZ_stat.py'
Dec 02 23:26:14 compute-0 sudo[49802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:14 compute-0 python3.9[49804]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:26:14 compute-0 sudo[49802]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:14 compute-0 sudo[49925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfgywklouguzjxadczltvvlyngvyxou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717973.814008-524-280579906102638/AnsiballZ_copy.py'
Dec 02 23:26:14 compute-0 sudo[49925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:15 compute-0 python3.9[49927]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764717973.814008-524-280579906102638/.source.json _original_basename=.lnm4ego8 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:26:15 compute-0 sudo[49925]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:16 compute-0 sudo[50077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woygbrwvhocpsszkyytdrbouzouricgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717975.531871-560-52304930541148/AnsiballZ_podman_image.py'
Dec 02 23:26:16 compute-0 sudo[50077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:16 compute-0 python3.9[50079]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1880457734-lower\x2dmapped.mount: Deactivated successfully.
Dec 02 23:26:22 compute-0 podman[50091]: 2025-12-02 23:26:22.535839714 +0000 UTC m=+6.121542085 image pull 78889ae0cf8c3740f43b6df72a2c4568ab589fb816614851d476abc277d3fffb 38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Dec 02 23:26:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:22 compute-0 sudo[50077]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:23 compute-0 sudo[50383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dykvpwzywzjmltsuriikbeodjvuxakcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717983.112308-582-154403566253072/AnsiballZ_podman_image.py'
Dec 02 23:26:23 compute-0 sudo[50383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:23 compute-0 python3.9[50385]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:33 compute-0 podman[50398]: 2025-12-02 23:26:33.745060361 +0000 UTC m=+10.103277559 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:26:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:33 compute-0 sudo[50383]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:34 compute-0 sudo[50696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwhxaevkcsxmlrbrjoysbmakurouxxlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717994.323497-602-133814009507306/AnsiballZ_podman_image.py'
Dec 02 23:26:34 compute-0 sudo[50696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:34 compute-0 python3.9[50698]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:35 compute-0 podman[50710]: 2025-12-02 23:26:35.30528939 +0000 UTC m=+0.342510557 image pull 13a8acc03c3934b75192e1b3a8c127f56bf115253a854621e8e0e8b6330d5e9b 38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Dec 02 23:26:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:35 compute-0 sudo[50696]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:36 compute-0 sudo[50944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmajovkjkcviddeyntpyfrmhghclnotr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764717995.9238405-620-203951651240924/AnsiballZ_podman_image.py'
Dec 02 23:26:36 compute-0 sudo[50944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:36 compute-0 python3.9[50946]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:51 compute-0 podman[50960]: 2025-12-02 23:26:51.445899774 +0000 UTC m=+14.844990226 image pull 99c98706e6d475ab9a9b50baf3431e8745aac38f98f776ef6ab7d3c7a2811699 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Dec 02 23:26:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:51 compute-0 sudo[50944]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:52 compute-0 sudo[51245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcnklzxubiqecgjezswifgavhpkiwout ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718012.1405728-642-216028437117421/AnsiballZ_podman_image.py'
Dec 02 23:26:52 compute-0 sudo[51245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:52 compute-0 python3.9[51247]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:54 compute-0 podman[51259]: 2025-12-02 23:26:54.416711957 +0000 UTC m=+1.733131615 image pull f524ba1018a442a347cd0e4973fee00e2d9be36d16bf76224f04e0d02efc067e 38.102.83.2:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Dec 02 23:26:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:54 compute-0 sudo[51245]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:54 compute-0 sudo[51514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjkptffqxoxbglizhziiinlaktnlfaio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718014.7224464-642-170485360902532/AnsiballZ_podman_image.py'
Dec 02 23:26:54 compute-0 sudo[51514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:26:55 compute-0 python3.9[51516]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 02 23:26:56 compute-0 podman[51528]: 2025-12-02 23:26:56.271795389 +0000 UTC m=+1.081116568 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 02 23:26:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:26:56 compute-0 sudo[51514]: pam_unix(sudo:session): session closed for user root
Dec 02 23:26:59 compute-0 sshd-session[45238]: Connection closed by 192.168.122.30 port 56358
Dec 02 23:26:59 compute-0 sshd-session[45235]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:26:59 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 02 23:26:59 compute-0 systemd[1]: session-11.scope: Consumed 1min 47.590s CPU time.
Dec 02 23:26:59 compute-0 systemd-logind[795]: Session 11 logged out. Waiting for processes to exit.
Dec 02 23:26:59 compute-0 systemd-logind[795]: Removed session 11.
Dec 02 23:27:04 compute-0 sshd-session[51676]: Accepted publickey for zuul from 192.168.122.30 port 51678 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:27:04 compute-0 systemd-logind[795]: New session 12 of user zuul.
Dec 02 23:27:04 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 02 23:27:04 compute-0 sshd-session[51676]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:27:05 compute-0 python3.9[51829]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:27:07 compute-0 sudo[51983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izxfbmwqitjclkvlbckykccdpwwgwdwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718026.522804-52-128750695248347/AnsiballZ_getent.py'
Dec 02 23:27:07 compute-0 sudo[51983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:07 compute-0 python3.9[51985]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 02 23:27:07 compute-0 sudo[51983]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:07 compute-0 sudo[52136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbogpxobsplzjvhjquwqpqcblzpqvbme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718027.4266837-68-90559768022170/AnsiballZ_group.py'
Dec 02 23:27:07 compute-0 sudo[52136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:08 compute-0 python3.9[52138]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:27:08 compute-0 groupadd[52139]: group added to /etc/group: name=openvswitch, GID=42476
Dec 02 23:27:08 compute-0 groupadd[52139]: group added to /etc/gshadow: name=openvswitch
Dec 02 23:27:08 compute-0 groupadd[52139]: new group: name=openvswitch, GID=42476
Dec 02 23:27:08 compute-0 sudo[52136]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:09 compute-0 sudo[52294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhycqqzlflnidsnjnctnoiudlocuelwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718028.4483051-84-205139784435378/AnsiballZ_user.py'
Dec 02 23:27:09 compute-0 sudo[52294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:09 compute-0 python3.9[52296]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:27:09 compute-0 useradd[52298]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:27:09 compute-0 useradd[52298]: add 'openvswitch' to group 'hugetlbfs'
Dec 02 23:27:09 compute-0 useradd[52298]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 02 23:27:09 compute-0 sudo[52294]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:10 compute-0 sudo[52454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkixsqhhnehzltklrkzrmjrhcqsmxfpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718029.8939135-104-173237754245663/AnsiballZ_setup.py'
Dec 02 23:27:10 compute-0 sudo[52454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:10 compute-0 python3.9[52456]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:27:10 compute-0 sudo[52454]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:11 compute-0 sudo[52538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clsxfbtnepkwjcakvocyfuuyjmprbino ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718029.8939135-104-173237754245663/AnsiballZ_dnf.py'
Dec 02 23:27:11 compute-0 sudo[52538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:11 compute-0 python3.9[52540]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:27:12 compute-0 sudo[52538]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:13 compute-0 sudo[52700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjqnatigotrfhauldheguxdqzjoavpoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718033.2483802-132-142886960282674/AnsiballZ_dnf.py'
Dec 02 23:27:13 compute-0 sudo[52700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:13 compute-0 python3.9[52702]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:26 compute-0 kernel: SELinux:  Converting 2733 SID table entries...
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:27:27 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:27:27 compute-0 groupadd[52725]: group added to /etc/group: name=unbound, GID=993
Dec 02 23:27:27 compute-0 groupadd[52725]: group added to /etc/gshadow: name=unbound
Dec 02 23:27:27 compute-0 groupadd[52725]: new group: name=unbound, GID=993
Dec 02 23:27:27 compute-0 useradd[52732]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 02 23:27:27 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 02 23:27:27 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 02 23:27:28 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:27:28 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:27:28 compute-0 systemd[1]: Reloading.
Dec 02 23:27:28 compute-0 systemd-rc-local-generator[53230]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:28 compute-0 systemd-sysv-generator[53233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:28 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:27:29 compute-0 sudo[52700]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:29 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:27:29 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:27:29 compute-0 systemd[1]: run-r25ef2b2816a7494181059845e8bc231c.service: Deactivated successfully.
Dec 02 23:27:30 compute-0 sudo[53798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acrztciowntzguhnadopcbwuokdzgoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718049.4203374-148-46319901243946/AnsiballZ_systemd.py'
Dec 02 23:27:30 compute-0 sudo[53798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:30 compute-0 python3.9[53800]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:27:30 compute-0 systemd[1]: Reloading.
Dec 02 23:27:30 compute-0 systemd-sysv-generator[53830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:30 compute-0 systemd-rc-local-generator[53827]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:30 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 02 23:27:30 compute-0 chown[53842]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 02 23:27:30 compute-0 ovs-ctl[53847]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 02 23:27:30 compute-0 ovs-ctl[53847]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 02 23:27:30 compute-0 ovs-ctl[53847]: Starting ovsdb-server [  OK  ]
Dec 02 23:27:30 compute-0 ovs-vsctl[53896]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 02 23:27:31 compute-0 ovs-vsctl[53915]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"83290d9e-bd8f-4c21-b54d-356f7c3da39f\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 02 23:27:31 compute-0 ovs-ctl[53847]: Configuring Open vSwitch system IDs [  OK  ]
Dec 02 23:27:31 compute-0 ovs-ctl[53847]: Enabling remote OVSDB managers [  OK  ]
Dec 02 23:27:31 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 02 23:27:31 compute-0 ovs-vsctl[53921]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 02 23:27:31 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 02 23:27:31 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 02 23:27:31 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 02 23:27:31 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 02 23:27:31 compute-0 ovs-ctl[53965]: Inserting openvswitch module [  OK  ]
Dec 02 23:27:31 compute-0 ovs-ctl[53934]: Starting ovs-vswitchd [  OK  ]
Dec 02 23:27:31 compute-0 ovs-vsctl[53982]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 02 23:27:31 compute-0 ovs-ctl[53934]: Enabling remote OVSDB managers [  OK  ]
Dec 02 23:27:31 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 02 23:27:31 compute-0 systemd[1]: Starting Open vSwitch...
Dec 02 23:27:31 compute-0 systemd[1]: Finished Open vSwitch.
Dec 02 23:27:31 compute-0 sudo[53798]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:32 compute-0 python3.9[54134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:27:33 compute-0 sudo[54284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zclmzpuvuecnmmgzmimxqhuivgeqtzwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718052.7047834-184-75199678768327/AnsiballZ_sefcontext.py'
Dec 02 23:27:33 compute-0 sudo[54284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:33 compute-0 python3.9[54286]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 02 23:27:34 compute-0 kernel: SELinux:  Converting 2747 SID table entries...
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:27:34 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:27:34 compute-0 sudo[54284]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:35 compute-0 python3.9[54441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:27:36 compute-0 sudo[54597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afnaelrdqtbabiafgrkvwcllrbptzwsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718056.2191634-220-153798455660209/AnsiballZ_dnf.py'
Dec 02 23:27:36 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 02 23:27:36 compute-0 sudo[54597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:36 compute-0 python3.9[54599]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:38 compute-0 sudo[54597]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:38 compute-0 sudo[54750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lylpmeagxhobgwfayuhcshumedkrycrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718058.2582896-236-14569655976161/AnsiballZ_command.py'
Dec 02 23:27:38 compute-0 sudo[54750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:38 compute-0 python3.9[54752]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:27:39 compute-0 sudo[54750]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:40 compute-0 sudo[55037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cryhcylnuzvelvfyisljxhqbapfdzhai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718060.0066924-252-228746908976285/AnsiballZ_file.py'
Dec 02 23:27:40 compute-0 sudo[55037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:40 compute-0 python3.9[55039]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 23:27:40 compute-0 sudo[55037]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:41 compute-0 python3.9[55189]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:27:42 compute-0 sudo[55341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsvsmbghqsotgucxqzfurlkijabdaymu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718061.9773138-284-68971419478764/AnsiballZ_dnf.py'
Dec 02 23:27:42 compute-0 sudo[55341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:42 compute-0 python3.9[55343]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:44 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:27:44 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:27:44 compute-0 systemd[1]: Reloading.
Dec 02 23:27:44 compute-0 systemd-rc-local-generator[55384]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:44 compute-0 systemd-sysv-generator[55387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:44 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:27:44 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:27:44 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:27:44 compute-0 systemd[1]: run-r429891ff06cf4b9a9bb95d5da3f36878.service: Deactivated successfully.
Dec 02 23:27:44 compute-0 sudo[55341]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:45 compute-0 sudo[55659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqdyqkejmhfezmscvmxlsudsmpsbdhgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718065.136809-300-160690349384462/AnsiballZ_systemd.py'
Dec 02 23:27:45 compute-0 sudo[55659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:45 compute-0 python3.9[55661]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:27:45 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 02 23:27:45 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 02 23:27:45 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 02 23:27:45 compute-0 systemd[1]: Stopping Network Manager...
Dec 02 23:27:45 compute-0 NetworkManager[7181]: <info>  [1764718065.8201] caught SIGTERM, shutting down normally.
Dec 02 23:27:45 compute-0 NetworkManager[7181]: <info>  [1764718065.8226] dhcp4 (eth0): canceled DHCP transaction
Dec 02 23:27:45 compute-0 NetworkManager[7181]: <info>  [1764718065.8226] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 23:27:45 compute-0 NetworkManager[7181]: <info>  [1764718065.8226] dhcp4 (eth0): state changed no lease
Dec 02 23:27:45 compute-0 NetworkManager[7181]: <info>  [1764718065.8234] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 23:27:45 compute-0 NetworkManager[7181]: <info>  [1764718065.8330] exiting (success)
Dec 02 23:27:45 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 23:27:45 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 23:27:45 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 02 23:27:45 compute-0 systemd[1]: Stopped Network Manager.
Dec 02 23:27:45 compute-0 systemd[1]: NetworkManager.service: Consumed 12.709s CPU time, 4.1M memory peak, read 0B from disk, written 41.0K to disk.
Dec 02 23:27:45 compute-0 systemd[1]: Starting Network Manager...
Dec 02 23:27:45 compute-0 NetworkManager[55671]: <info>  [1764718065.9069] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:13ba9be2-a183-422c-a29d-1c0aec36730d)
Dec 02 23:27:45 compute-0 NetworkManager[55671]: <info>  [1764718065.9071] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 02 23:27:45 compute-0 NetworkManager[55671]: <info>  [1764718065.9137] manager[0x557787c71090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 02 23:27:45 compute-0 systemd[1]: Starting Hostname Service...
Dec 02 23:27:46 compute-0 systemd[1]: Started Hostname Service.
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0302] hostname: hostname: using hostnamed
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0303] hostname: static hostname changed from (none) to "compute-0"
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0308] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0314] manager[0x557787c71090]: rfkill: Wi-Fi hardware radio set enabled
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0315] manager[0x557787c71090]: rfkill: WWAN hardware radio set enabled
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0339] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0349] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0349] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0350] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0350] manager: Networking is enabled by state file
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0352] settings: Loaded settings plugin: keyfile (internal)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0356] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0386] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0397] dhcp: init: Using DHCP client 'internal'
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0400] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0404] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0411] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0419] device (lo): Activation: starting connection 'lo' (c5650cb8-9795-426f-8b6b-42dce70d6cce)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0427] device (eth0): carrier: link connected
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0431] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0436] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0437] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0443] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0451] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0457] device (eth1): carrier: link connected
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0461] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0466] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (2b145667-e1cd-593e-beec-410c178624e9) (indicated)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0467] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0472] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0480] device (eth1): Activation: starting connection 'ci-private-network' (2b145667-e1cd-593e-beec-410c178624e9)
Dec 02 23:27:46 compute-0 systemd[1]: Started Network Manager.
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0487] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0497] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0499] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0502] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0504] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0508] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0511] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0515] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0519] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0528] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0531] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0541] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0555] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0564] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0566] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0571] device (lo): Activation: successful, device activated.
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0579] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0582] dhcp4 (eth0): state changed new lease, address=38.102.83.77
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0584] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0587] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0591] device (eth1): Activation: successful, device activated.
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0601] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0671] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0704] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0706] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0709] manager: NetworkManager state is now CONNECTED_SITE
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0712] device (eth0): Activation: successful, device activated.
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0718] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 02 23:27:46 compute-0 NetworkManager[55671]: <info>  [1764718066.0743] manager: startup complete
Dec 02 23:27:46 compute-0 sudo[55659]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:46 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 02 23:27:46 compute-0 sudo[55885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktzeufiyvygjumntxyiwrcbccskqxlbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718066.3371031-316-64777087817946/AnsiballZ_dnf.py'
Dec 02 23:27:46 compute-0 sudo[55885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:46 compute-0 python3.9[55887]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:27:51 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:27:51 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:27:51 compute-0 systemd[1]: Reloading.
Dec 02 23:27:51 compute-0 systemd-rc-local-generator[55935]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:27:51 compute-0 systemd-sysv-generator[55940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:27:51 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:27:52 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:27:52 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:27:52 compute-0 systemd[1]: run-r747be86c4fba411bb9a41d1044a2a741.service: Deactivated successfully.
Dec 02 23:27:52 compute-0 sudo[55885]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:53 compute-0 sudo[56343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjzwtagkjsijqusegnxdncpjeqynhlgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718073.2867792-340-217623823745093/AnsiballZ_stat.py'
Dec 02 23:27:53 compute-0 sudo[56343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:53 compute-0 python3.9[56345]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:27:53 compute-0 sudo[56343]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:54 compute-0 sudo[56495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfvxylsbzcblcjbjgklfbjkltzupjtdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718074.1347768-358-12931729909013/AnsiballZ_ini_file.py'
Dec 02 23:27:54 compute-0 sudo[56495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:54 compute-0 python3.9[56497]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:54 compute-0 sudo[56495]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:55 compute-0 sudo[56649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nggtqzdvxyxaqddtsnfbxhysnwzwcmzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718075.235332-378-174226087026082/AnsiballZ_ini_file.py'
Dec 02 23:27:55 compute-0 sudo[56649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:55 compute-0 python3.9[56651]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:55 compute-0 sudo[56649]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:56 compute-0 sudo[56801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ourxbdhuxifkljfynsiglskzcftcszwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718075.900039-378-81223800328503/AnsiballZ_ini_file.py'
Dec 02 23:27:56 compute-0 sudo[56801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:56 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 23:27:56 compute-0 python3.9[56803]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:56 compute-0 sudo[56801]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:57 compute-0 sudo[56953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwksbuvwiedkkblhlxufkgpztdraechs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718076.6906927-408-126906847687949/AnsiballZ_ini_file.py'
Dec 02 23:27:57 compute-0 sudo[56953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:57 compute-0 python3.9[56955]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:57 compute-0 sudo[56953]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:57 compute-0 sudo[57105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksrznwdtstgyprlelbpwlqrhdhsukbys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718077.3862653-408-114762554813370/AnsiballZ_ini_file.py'
Dec 02 23:27:57 compute-0 sudo[57105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:57 compute-0 python3.9[57107]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:57 compute-0 sudo[57105]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:58 compute-0 sudo[57257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfzmxijhlgmhkzffnmwggpqqmhotiejf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718078.2280753-438-275983154945337/AnsiballZ_stat.py'
Dec 02 23:27:58 compute-0 sudo[57257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:58 compute-0 python3.9[57259]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:27:58 compute-0 sudo[57257]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:59 compute-0 sudo[57380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hksjteryzfbgueixcfmpucgmnjykndiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718078.2280753-438-275983154945337/AnsiballZ_copy.py'
Dec 02 23:27:59 compute-0 sudo[57380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:27:59 compute-0 python3.9[57382]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718078.2280753-438-275983154945337/.source _original_basename=.ubs07bb6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:27:59 compute-0 sudo[57380]: pam_unix(sudo:session): session closed for user root
Dec 02 23:27:59 compute-0 sudo[57532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhnooewylsvfkjmmubynnekomikpdzcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718079.5940802-468-198760417807256/AnsiballZ_file.py'
Dec 02 23:27:59 compute-0 sudo[57532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:00 compute-0 python3.9[57534]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:00 compute-0 sudo[57532]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:00 compute-0 sudo[57684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqhytqggnwzxrsqrjfsjksuueqkpnpgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718080.3035128-484-150704232527890/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 02 23:28:00 compute-0 sudo[57684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:01 compute-0 python3.9[57686]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 02 23:28:01 compute-0 sudo[57684]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:01 compute-0 sudo[57836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpruhliysquooycprkveotpgsczkqmgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718081.291964-502-11963202286527/AnsiballZ_file.py'
Dec 02 23:28:01 compute-0 sudo[57836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:01 compute-0 python3.9[57838]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:01 compute-0 sudo[57836]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:02 compute-0 sudo[57988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djydftteedqidyfwngwcnmsftatneupk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718082.3067508-522-179046093155827/AnsiballZ_stat.py'
Dec 02 23:28:02 compute-0 sudo[57988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:02 compute-0 sudo[57988]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:03 compute-0 sudo[58111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcrkbxqkoafslhfeqtxcnmslzantfmwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718082.3067508-522-179046093155827/AnsiballZ_copy.py'
Dec 02 23:28:03 compute-0 sudo[58111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:03 compute-0 sudo[58111]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:04 compute-0 sudo[58263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwgvomlcanixybwrfacnmgkuppttmpuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718083.554169-552-92119768248231/AnsiballZ_slurp.py'
Dec 02 23:28:04 compute-0 sudo[58263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:04 compute-0 python3.9[58265]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 02 23:28:04 compute-0 sudo[58263]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:05 compute-0 sudo[58438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jagfyosftxccllhdyynhvusxnyoqqbwf ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718084.459628-570-98570886684347/async_wrapper.py j229287603559 300 /home/zuul/.ansible/tmp/ansible-tmp-1764718084.459628-570-98570886684347/AnsiballZ_edpm_os_net_config.py _'
Dec 02 23:28:05 compute-0 sudo[58438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:05 compute-0 ansible-async_wrapper.py[58440]: Invoked with j229287603559 300 /home/zuul/.ansible/tmp/ansible-tmp-1764718084.459628-570-98570886684347/AnsiballZ_edpm_os_net_config.py _
Dec 02 23:28:05 compute-0 ansible-async_wrapper.py[58443]: Starting module and watcher
Dec 02 23:28:05 compute-0 ansible-async_wrapper.py[58443]: Start watching 58444 (300)
Dec 02 23:28:05 compute-0 ansible-async_wrapper.py[58444]: Start module (58444)
Dec 02 23:28:05 compute-0 ansible-async_wrapper.py[58440]: Return async_wrapper task started.
Dec 02 23:28:05 compute-0 sudo[58438]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:05 compute-0 python3.9[58445]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 02 23:28:06 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 02 23:28:06 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 02 23:28:06 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 02 23:28:06 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 02 23:28:06 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.0708] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.0722] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1213] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1214] audit: op="connection-add" uuid="6e3abe90-a217-40b1-a976-37a1e24a5034" name="br-ex-br" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1234] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1236] audit: op="connection-add" uuid="83a661bf-a532-49da-b821-76f978558d58" name="br-ex-port" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1251] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1253] audit: op="connection-add" uuid="34f0a2dc-755a-411b-a0e0-2f02829a5fcb" name="eth1-port" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1265] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1266] audit: op="connection-add" uuid="f2ec2f5b-c45f-4f85-8fe0-49bfaef9d926" name="vlan20-port" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1277] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1279] audit: op="connection-add" uuid="d6a2488f-6e9e-4ff1-a3a5-c2190d6e8522" name="vlan21-port" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1291] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1293] audit: op="connection-add" uuid="93ab8014-7981-408e-9cdc-51da5ad25faa" name="vlan22-port" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1312] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,connection.timestamp,connection.autoconnect-priority" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1330] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1331] audit: op="connection-add" uuid="0cd6044f-700b-4eb0-b010-952832b09e2d" name="br-ex-if" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1390] audit: op="connection-update" uuid="2b145667-e1cd-593e-beec-410c178624e9" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.routes,ipv4.method,ipv4.routing-rules,ipv6.addr-gen-mode,ipv6.dns,ipv6.addresses,ipv6.routes,ipv6.method,ipv6.routing-rules,connection.timestamp,connection.port-type,connection.controller,connection.master,connection.slave-type" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1406] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1408] audit: op="connection-add" uuid="6a91c662-e932-4d30-91be-2b9c39b1c25e" name="vlan20-if" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1426] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1428] audit: op="connection-add" uuid="344485a2-0e6f-4ff6-8383-19d95aaedd59" name="vlan21-if" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1444] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1448] audit: op="connection-add" uuid="8d5ac5e7-5768-4441-b985-4c1ea99331c1" name="vlan22-if" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1461] audit: op="connection-delete" uuid="c810458a-fafa-397b-a0ff-2bc5f1fe918f" name="Wired connection 1" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1474] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1483] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1487] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (6e3abe90-a217-40b1-a976-37a1e24a5034)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1488] audit: op="connection-activate" uuid="6e3abe90-a217-40b1-a976-37a1e24a5034" name="br-ex-br" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1490] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1496] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1501] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (83a661bf-a532-49da-b821-76f978558d58)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1503] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1509] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1513] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (34f0a2dc-755a-411b-a0e0-2f02829a5fcb)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1515] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1523] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1528] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (f2ec2f5b-c45f-4f85-8fe0-49bfaef9d926)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1532] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1540] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1544] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (d6a2488f-6e9e-4ff1-a3a5-c2190d6e8522)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1546] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1552] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1556] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (93ab8014-7981-408e-9cdc-51da5ad25faa)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1557] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1560] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1562] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1568] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1572] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1577] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (0cd6044f-700b-4eb0-b010-952832b09e2d)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1579] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1583] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1585] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1587] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1589] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1600] device (eth1): disconnecting for new activation request.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1601] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1605] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1607] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1609] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1613] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1619] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1623] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (6a91c662-e932-4d30-91be-2b9c39b1c25e)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1624] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1628] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1631] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1633] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1637] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1642] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1647] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (344485a2-0e6f-4ff6-8383-19d95aaedd59)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1649] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1652] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1655] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1657] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1661] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1667] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1672] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (8d5ac5e7-5768-4441-b985-4c1ea99331c1)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1673] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1677] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1680] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1682] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1684] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1696] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1698] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1701] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1703] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1710] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1713] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1720] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1723] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1725] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1730] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1747] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 systemd-udevd[58452]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1751] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1753] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1756] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1760] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 kernel: Timeout policy base is empty
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1768] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1770] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1776] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1780] dhcp4 (eth0): canceled DHCP transaction
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1780] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1780] dhcp4 (eth0): state changed no lease
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1782] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1792] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1795] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58446 uid=0 result="fail" reason="Device is not activated"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1824] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 02 23:28:07 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1835] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1839] dhcp4 (eth0): state changed new lease, address=38.102.83.77
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1911] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.1919] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2036] device (eth1): Activation: starting connection 'ci-private-network' (2b145667-e1cd-593e-beec-410c178624e9)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2041] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2044] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2053] device (eth1): disconnecting for new activation request.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2055] audit: op="connection-activate" uuid="2b145667-e1cd-593e-beec-410c178624e9" name="ci-private-network" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2055] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2057] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2064] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2072] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2078] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2085] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2091] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2100] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2106] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2109] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2113] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2122] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2129] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2134] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2142] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2149] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2164] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2168] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2178] device (eth1): Activation: starting connection 'ci-private-network' (2b145667-e1cd-593e-beec-410c178624e9)
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2184] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58446 uid=0 result="success"
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2189] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2195] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2203] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2236] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2240] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2256] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2259] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2266] device (eth1): Activation: successful, device activated.
Dec 02 23:28:07 compute-0 kernel: br-ex: entered promiscuous mode
Dec 02 23:28:07 compute-0 kernel: vlan22: entered promiscuous mode
Dec 02 23:28:07 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 02 23:28:07 compute-0 systemd-udevd[58451]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2416] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2428] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2472] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2474] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2480] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 kernel: vlan21: entered promiscuous mode
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2531] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2548] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2566] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2568] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2574] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 kernel: vlan20: entered promiscuous mode
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2611] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2628] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2658] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2663] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2669] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2704] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2716] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2743] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2745] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 02 23:28:07 compute-0 NetworkManager[55671]: <info>  [1764718087.2751] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.3940] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58446 uid=0 result="success"
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.5197] checkpoint[0x557787c46950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.5198] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58446 uid=0 result="success"
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.7693] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58446 uid=0 result="success"
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.7703] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58446 uid=0 result="success"
Dec 02 23:28:08 compute-0 sudo[58783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwzvaorosxsacyzpvkkmvbmqvbgmirv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718088.4021397-570-213855493697213/AnsiballZ_async_status.py'
Dec 02 23:28:08 compute-0 sudo[58783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.9656] audit: op="networking-control" arg="global-dns-configuration" pid=58446 uid=0 result="success"
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.9683] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.9717] audit: op="networking-control" arg="global-dns-configuration" pid=58446 uid=0 result="success"
Dec 02 23:28:08 compute-0 NetworkManager[55671]: <info>  [1764718088.9736] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58446 uid=0 result="success"
Dec 02 23:28:09 compute-0 NetworkManager[55671]: <info>  [1764718089.1046] checkpoint[0x557787c46a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 02 23:28:09 compute-0 NetworkManager[55671]: <info>  [1764718089.1053] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58446 uid=0 result="success"
Dec 02 23:28:09 compute-0 ansible-async_wrapper.py[58444]: Module complete (58444)
Dec 02 23:28:09 compute-0 python3.9[58785]: ansible-ansible.legacy.async_status Invoked with jid=j229287603559.58440 mode=status _async_dir=/root/.ansible_async
Dec 02 23:28:09 compute-0 sudo[58783]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:09 compute-0 sudo[58887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuwahoparkcpjwwrknihfsqgqjuexkbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718088.4021397-570-213855493697213/AnsiballZ_async_status.py'
Dec 02 23:28:09 compute-0 sudo[58887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:09 compute-0 python3.9[58889]: ansible-ansible.legacy.async_status Invoked with jid=j229287603559.58440 mode=cleanup _async_dir=/root/.ansible_async
Dec 02 23:28:09 compute-0 sudo[58887]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:10 compute-0 ansible-async_wrapper.py[58443]: Done in kid B.
Dec 02 23:28:13 compute-0 sshd[1004]: Timeout before authentication for connection from 45.78.218.154 to 38.102.83.77, pid = 49311
Dec 02 23:28:13 compute-0 sudo[59040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uibfvfazjczfzotcyqhqqqetbbeutnog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718093.444782-619-230877980417646/AnsiballZ_stat.py'
Dec 02 23:28:13 compute-0 sudo[59040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:13 compute-0 python3.9[59042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:13 compute-0 sudo[59040]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:14 compute-0 sudo[59163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xckgnmrpffwudkylnwqwplfolzqkzhcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718093.444782-619-230877980417646/AnsiballZ_copy.py'
Dec 02 23:28:14 compute-0 sudo[59163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:14 compute-0 python3.9[59165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718093.444782-619-230877980417646/.source.returncode _original_basename=.vsc5hf_z follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:14 compute-0 sudo[59163]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:14 compute-0 sudo[59315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdszhwazzovvmnwluvsietxwvdfyagpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718094.6809225-651-269633987116930/AnsiballZ_stat.py'
Dec 02 23:28:14 compute-0 sudo[59315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:15 compute-0 python3.9[59317]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:15 compute-0 sudo[59315]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:15 compute-0 sudo[59438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szluezyeadmbpdjqnufaahdkclyfxifz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718094.6809225-651-269633987116930/AnsiballZ_copy.py'
Dec 02 23:28:15 compute-0 sudo[59438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:15 compute-0 python3.9[59440]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718094.6809225-651-269633987116930/.source.cfg _original_basename=.nubjldf5 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:15 compute-0 sudo[59438]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:16 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 02 23:28:16 compute-0 sudo[59593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oczqwxhrcnauwztpdabkbaakbdbbnusk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718095.9330883-681-27015836042033/AnsiballZ_systemd.py'
Dec 02 23:28:16 compute-0 sudo[59593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:16 compute-0 python3.9[59595]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:28:16 compute-0 systemd[1]: Reloading Network Manager...
Dec 02 23:28:16 compute-0 NetworkManager[55671]: <info>  [1764718096.6627] audit: op="reload" arg="0" pid=59599 uid=0 result="success"
Dec 02 23:28:16 compute-0 NetworkManager[55671]: <info>  [1764718096.6637] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 02 23:28:16 compute-0 systemd[1]: Reloaded Network Manager.
Dec 02 23:28:16 compute-0 sudo[59593]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:17 compute-0 sshd-session[51679]: Connection closed by 192.168.122.30 port 51678
Dec 02 23:28:17 compute-0 sshd-session[51676]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:28:17 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 02 23:28:17 compute-0 systemd[1]: session-12.scope: Consumed 48.373s CPU time.
Dec 02 23:28:17 compute-0 systemd-logind[795]: Session 12 logged out. Waiting for processes to exit.
Dec 02 23:28:17 compute-0 systemd-logind[795]: Removed session 12.
Dec 02 23:28:22 compute-0 sshd-session[59630]: Accepted publickey for zuul from 192.168.122.30 port 48070 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:28:22 compute-0 systemd-logind[795]: New session 13 of user zuul.
Dec 02 23:28:22 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 02 23:28:22 compute-0 sshd-session[59630]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:28:23 compute-0 python3.9[59783]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:24 compute-0 python3.9[59937]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:28:26 compute-0 python3.9[60127]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:28:26 compute-0 sshd-session[59633]: Connection closed by 192.168.122.30 port 48070
Dec 02 23:28:26 compute-0 sshd-session[59630]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:28:26 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 02 23:28:26 compute-0 systemd[1]: session-13.scope: Consumed 2.418s CPU time.
Dec 02 23:28:26 compute-0 systemd-logind[795]: Session 13 logged out. Waiting for processes to exit.
Dec 02 23:28:26 compute-0 systemd-logind[795]: Removed session 13.
Dec 02 23:28:26 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 02 23:28:30 compute-0 sshd[1004]: drop connection #0 from [45.78.218.154]:60690 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 02 23:28:31 compute-0 sshd-session[60155]: Accepted publickey for zuul from 192.168.122.30 port 47612 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:28:31 compute-0 systemd-logind[795]: New session 14 of user zuul.
Dec 02 23:28:31 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 02 23:28:31 compute-0 sshd-session[60155]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:28:32 compute-0 python3.9[60309]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:33 compute-0 python3.9[60463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:34 compute-0 sudo[60617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkcojkchszwgcynapqnedhgjrylbeafm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718114.0780041-60-116072752351814/AnsiballZ_setup.py'
Dec 02 23:28:34 compute-0 sudo[60617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:34 compute-0 python3.9[60619]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:28:34 compute-0 sudo[60617]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:35 compute-0 sudo[60702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjqgrwujtakjzklueqalbrqzqkfqkaad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718114.0780041-60-116072752351814/AnsiballZ_dnf.py'
Dec 02 23:28:35 compute-0 sudo[60702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:35 compute-0 python3.9[60704]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:28:36 compute-0 sudo[60702]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:37 compute-0 sudo[60855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfojjvbfpgpowcpsbpfykqvznhrpebmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718116.9214902-84-250174027498739/AnsiballZ_setup.py'
Dec 02 23:28:37 compute-0 sudo[60855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:37 compute-0 python3.9[60857]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:28:37 compute-0 sudo[60855]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:38 compute-0 sudo[61047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffidrngypebetwhhewhgdymqgafmrswb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718118.2477756-106-78725938337997/AnsiballZ_file.py'
Dec 02 23:28:38 compute-0 sudo[61047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:39 compute-0 python3.9[61049]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:39 compute-0 sudo[61047]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:39 compute-0 sudo[61199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frlzcxrynzqjxxfwpfykantokqqbhnho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718119.2894998-122-260669144144025/AnsiballZ_command.py'
Dec 02 23:28:39 compute-0 sudo[61199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:39 compute-0 python3.9[61201]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:28:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:28:39 compute-0 sudo[61199]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:40 compute-0 sudo[61362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abkdtwyvkjwqcefgkbkmjjlgvxffjsrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718120.351734-138-211330199298442/AnsiballZ_stat.py'
Dec 02 23:28:40 compute-0 sudo[61362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:41 compute-0 python3.9[61364]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:41 compute-0 sudo[61362]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:41 compute-0 sudo[61440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wghjkoanjoekymayxdvofpugjaotfyuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718120.351734-138-211330199298442/AnsiballZ_file.py'
Dec 02 23:28:41 compute-0 sudo[61440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:41 compute-0 python3.9[61442]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:28:41 compute-0 sudo[61440]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:42 compute-0 sudo[61592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fijataephiawmwsnnyrmgbhmhfilxpid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718121.840227-162-235912112926526/AnsiballZ_stat.py'
Dec 02 23:28:42 compute-0 sudo[61592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:42 compute-0 python3.9[61594]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:28:42 compute-0 sudo[61592]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:42 compute-0 sudo[61670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vthndurvsillrmrbqkfihfigmpnkmhqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718121.840227-162-235912112926526/AnsiballZ_file.py'
Dec 02 23:28:42 compute-0 sudo[61670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:42 compute-0 python3.9[61672]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:42 compute-0 sudo[61670]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:43 compute-0 sudo[61822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iedwdatcnmgzyynwdcpydozmmmcuqlrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718123.3083522-188-151563853923863/AnsiballZ_ini_file.py'
Dec 02 23:28:43 compute-0 sudo[61822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:44 compute-0 python3.9[61824]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:44 compute-0 sudo[61822]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:44 compute-0 sudo[61974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnyxoncsirwijuuoxrzrudvflkdjhrho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718124.2588441-188-142477084476876/AnsiballZ_ini_file.py'
Dec 02 23:28:44 compute-0 sudo[61974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:44 compute-0 python3.9[61976]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:44 compute-0 sudo[61974]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:45 compute-0 sudo[62126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmqyojjslyitpdwrnuqckichcifvfaow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718124.901352-188-79730381071136/AnsiballZ_ini_file.py'
Dec 02 23:28:45 compute-0 sudo[62126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:45 compute-0 python3.9[62128]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:45 compute-0 sudo[62126]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:45 compute-0 sudo[62278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bswtlwtjmwgpauyrlkdmbqsxwglhqyfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718125.4983292-188-207873402908708/AnsiballZ_ini_file.py'
Dec 02 23:28:45 compute-0 sudo[62278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:45 compute-0 python3.9[62280]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:28:45 compute-0 sudo[62278]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:46 compute-0 sudo[62430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrtfdpduiklxjghcbxqbkqsgbnsavcev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718126.4260898-250-248897685092833/AnsiballZ_dnf.py'
Dec 02 23:28:46 compute-0 sudo[62430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:46 compute-0 python3.9[62432]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:28:48 compute-0 sudo[62430]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:48 compute-0 sudo[62583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jexyxkzzfvyonbsxjiuioiowrjqxbone ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718128.584013-272-231980737115156/AnsiballZ_setup.py'
Dec 02 23:28:48 compute-0 sudo[62583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:49 compute-0 python3.9[62585]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:28:50 compute-0 sudo[62583]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:50 compute-0 sudo[62737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfcegkhqtwdgdxhygdsgmzdkkokhektv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718130.1973584-288-97470450078280/AnsiballZ_stat.py'
Dec 02 23:28:50 compute-0 sudo[62737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:50 compute-0 python3.9[62739]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:28:50 compute-0 sudo[62737]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:51 compute-0 sudo[62889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebfajiiermlsxhyxwyundbgnzvpgjapb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718131.0247972-306-60052142389137/AnsiballZ_stat.py'
Dec 02 23:28:51 compute-0 sudo[62889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:51 compute-0 python3.9[62891]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:28:51 compute-0 sudo[62889]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:52 compute-0 sudo[63041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eudbdbhuhnpllcwnyldkzsptqonwqxzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718131.9891922-326-232632383210461/AnsiballZ_command.py'
Dec 02 23:28:52 compute-0 sudo[63041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:52 compute-0 python3.9[63043]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:28:52 compute-0 sudo[63041]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:53 compute-0 sudo[63194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkbanybmoigscmpezvwmpobbvyyinvys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718132.8285472-346-201070432273751/AnsiballZ_service_facts.py'
Dec 02 23:28:53 compute-0 sudo[63194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:53 compute-0 python3.9[63196]: ansible-service_facts Invoked
Dec 02 23:28:53 compute-0 network[63213]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:28:53 compute-0 network[63214]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:28:53 compute-0 network[63215]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:28:55 compute-0 sudo[63194]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:57 compute-0 sudo[63498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwgifxjbmuhikboeeqjscxuwmabsfxbw ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764718137.22219-376-80136514478869/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764718137.22219-376-80136514478869/args'
Dec 02 23:28:57 compute-0 sudo[63498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:57 compute-0 sudo[63498]: pam_unix(sudo:session): session closed for user root
Dec 02 23:28:58 compute-0 sudo[63665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcgfubbifzlotbbwlpwnvcfvbwzertjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718137.951407-398-85048980540130/AnsiballZ_dnf.py'
Dec 02 23:28:58 compute-0 sudo[63665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:28:58 compute-0 python3.9[63667]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:28:59 compute-0 sudo[63665]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:00 compute-0 sudo[63818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykrwdyctfmvdmsmiejublkexwelthwdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718140.1448414-424-84053370373209/AnsiballZ_package_facts.py'
Dec 02 23:29:00 compute-0 sudo[63818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:01 compute-0 python3.9[63820]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 02 23:29:01 compute-0 sudo[63818]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:02 compute-0 sudo[63970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qufboresrxmlrncxgjtuxwuiykgwuxgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718141.8402753-444-169312161910801/AnsiballZ_stat.py'
Dec 02 23:29:02 compute-0 sudo[63970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:02 compute-0 python3.9[63972]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:02 compute-0 sudo[63970]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:02 compute-0 sudo[64095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isxjeeutbdutqrejwyylkoxktedkofid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718141.8402753-444-169312161910801/AnsiballZ_copy.py'
Dec 02 23:29:02 compute-0 sudo[64095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:03 compute-0 python3.9[64097]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718141.8402753-444-169312161910801/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:03 compute-0 sudo[64095]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:03 compute-0 sudo[64249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuaozlxxsicnglpmsjzgvmmfwdrfsfxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718143.5000966-474-260462066539263/AnsiballZ_stat.py'
Dec 02 23:29:03 compute-0 sudo[64249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:04 compute-0 python3.9[64251]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:04 compute-0 sudo[64249]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:04 compute-0 sudo[64374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euqrlaeuxscwoiyexpjxrchcbxeqkltt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718143.5000966-474-260462066539263/AnsiballZ_copy.py'
Dec 02 23:29:04 compute-0 sudo[64374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:04 compute-0 python3.9[64376]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718143.5000966-474-260462066539263/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:04 compute-0 sudo[64374]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:05 compute-0 sudo[64528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgijmkvrbmsoqyqvyriwwxwchjlkrelb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718145.4794512-516-149250939394417/AnsiballZ_lineinfile.py'
Dec 02 23:29:05 compute-0 sudo[64528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:06 compute-0 python3.9[64530]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:06 compute-0 sudo[64528]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:07 compute-0 sudo[64682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuwwxoghyymulwqptwdmzdxkltfzlzli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718147.083543-546-207693323926320/AnsiballZ_setup.py'
Dec 02 23:29:07 compute-0 sudo[64682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:07 compute-0 python3.9[64684]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:29:07 compute-0 sudo[64682]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:08 compute-0 sudo[64766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amrrzqhgbpgaqryqupsafvhywzikgxnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718147.083543-546-207693323926320/AnsiballZ_systemd.py'
Dec 02 23:29:08 compute-0 sudo[64766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:08 compute-0 python3.9[64768]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:08 compute-0 sudo[64766]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:09 compute-0 sudo[64920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjgazyfhqdnrxdbqpuamzfoxzffkqabx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718149.574155-578-251004155128731/AnsiballZ_setup.py'
Dec 02 23:29:09 compute-0 sudo[64920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:10 compute-0 python3.9[64922]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:29:10 compute-0 sudo[64920]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:10 compute-0 sudo[65004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgmfewsozudflzzcfuzwveojluiwropg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718149.574155-578-251004155128731/AnsiballZ_systemd.py'
Dec 02 23:29:10 compute-0 sudo[65004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:10 compute-0 python3.9[65006]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:29:10 compute-0 systemd[1]: Stopping NTP client/server...
Dec 02 23:29:10 compute-0 chronyd[785]: chronyd exiting
Dec 02 23:29:10 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 02 23:29:10 compute-0 systemd[1]: Stopped NTP client/server.
Dec 02 23:29:10 compute-0 systemd[1]: Starting NTP client/server...
Dec 02 23:29:11 compute-0 chronyd[65014]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 02 23:29:11 compute-0 chronyd[65014]: Frequency -24.530 +/- 0.316 ppm read from /var/lib/chrony/drift
Dec 02 23:29:11 compute-0 chronyd[65014]: Loaded seccomp filter (level 2)
Dec 02 23:29:11 compute-0 systemd[1]: Started NTP client/server.
Dec 02 23:29:11 compute-0 sudo[65004]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:11 compute-0 sshd-session[60158]: Connection closed by 192.168.122.30 port 47612
Dec 02 23:29:11 compute-0 sshd-session[60155]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:29:11 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 02 23:29:11 compute-0 systemd[1]: session-14.scope: Consumed 25.513s CPU time.
Dec 02 23:29:11 compute-0 systemd-logind[795]: Session 14 logged out. Waiting for processes to exit.
Dec 02 23:29:11 compute-0 systemd-logind[795]: Removed session 14.
Dec 02 23:29:17 compute-0 sshd-session[65040]: Accepted publickey for zuul from 192.168.122.30 port 50164 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:29:17 compute-0 systemd-logind[795]: New session 15 of user zuul.
Dec 02 23:29:17 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 02 23:29:17 compute-0 sshd-session[65040]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:29:18 compute-0 python3.9[65193]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:29:19 compute-0 sudo[65347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjilmhgvxjjkjzsvmybkckuohcfrichs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718158.874117-46-65591005611867/AnsiballZ_file.py'
Dec 02 23:29:19 compute-0 sudo[65347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:19 compute-0 python3.9[65349]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:19 compute-0 sudo[65347]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:20 compute-0 sudo[65522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtdyeinmynfdxmuwaddbubojkuxqubtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718159.7805169-62-21163378373629/AnsiballZ_stat.py'
Dec 02 23:29:20 compute-0 sudo[65522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:20 compute-0 python3.9[65524]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:20 compute-0 sudo[65522]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:20 compute-0 sudo[65600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olnnusjygnwhqpwvyntwortbdvvztgzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718159.7805169-62-21163378373629/AnsiballZ_file.py'
Dec 02 23:29:20 compute-0 sudo[65600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:21 compute-0 python3.9[65602]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.0p38mnn_ recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:21 compute-0 sudo[65600]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:21 compute-0 sudo[65752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmzbfuyskwsyzfyydmksrgoptusfvijo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718161.5162196-102-118687892761434/AnsiballZ_stat.py'
Dec 02 23:29:21 compute-0 sudo[65752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:22 compute-0 python3.9[65754]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:22 compute-0 sudo[65752]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:22 compute-0 sudo[65875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-attdgqrxtljiocxeruyydhtcrpyabomc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718161.5162196-102-118687892761434/AnsiballZ_copy.py'
Dec 02 23:29:22 compute-0 sudo[65875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:22 compute-0 python3.9[65877]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718161.5162196-102-118687892761434/.source _original_basename=.asdghjqr follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:22 compute-0 sudo[65875]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:23 compute-0 sudo[66027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uejpumfnmgcpvrdybdqwkqzsakneimxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718162.9084475-134-224585058595033/AnsiballZ_file.py'
Dec 02 23:29:23 compute-0 sudo[66027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:23 compute-0 python3.9[66029]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:29:23 compute-0 sudo[66027]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:24 compute-0 sudo[66179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxnggxxatuhmfxlojwqsipporgqggaxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718163.6769018-150-91930880989776/AnsiballZ_stat.py'
Dec 02 23:29:24 compute-0 sudo[66179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:24 compute-0 python3.9[66181]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:24 compute-0 sudo[66179]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:24 compute-0 sudo[66302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlymbuokorarimpnukpyrkgigytjikvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718163.6769018-150-91930880989776/AnsiballZ_copy.py'
Dec 02 23:29:24 compute-0 sudo[66302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:24 compute-0 python3.9[66304]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718163.6769018-150-91930880989776/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:29:24 compute-0 sudo[66302]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:25 compute-0 sudo[66454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sadolgrkaqeclgayepbltvjnzbuqliwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718164.8972018-150-30175895260148/AnsiballZ_stat.py'
Dec 02 23:29:25 compute-0 sudo[66454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:25 compute-0 python3.9[66456]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:25 compute-0 sudo[66454]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:25 compute-0 sudo[66577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmfggbugepdmfuspketymhhgksxtunoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718164.8972018-150-30175895260148/AnsiballZ_copy.py'
Dec 02 23:29:25 compute-0 sudo[66577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:25 compute-0 python3.9[66579]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718164.8972018-150-30175895260148/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:29:25 compute-0 sudo[66577]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:26 compute-0 sudo[66729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvzwnbihlxxpzykkzupwabulmyerqbda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718166.3538384-208-111273032181879/AnsiballZ_file.py'
Dec 02 23:29:26 compute-0 sudo[66729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:26 compute-0 python3.9[66731]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:26 compute-0 sudo[66729]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:27 compute-0 sudo[66881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tugxvckcpnlxvpfshxorqnygkvuvknjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718167.1995192-224-186733333615135/AnsiballZ_stat.py'
Dec 02 23:29:27 compute-0 sudo[66881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:27 compute-0 python3.9[66883]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:27 compute-0 sudo[66881]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:28 compute-0 sudo[67004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obfgbopmwxbozaqtonzeejwfkaehmmoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718167.1995192-224-186733333615135/AnsiballZ_copy.py'
Dec 02 23:29:28 compute-0 sudo[67004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:28 compute-0 python3.9[67006]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718167.1995192-224-186733333615135/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:28 compute-0 sudo[67004]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:28 compute-0 sudo[67156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfysfocoyxaijtnllvecbdnolkmasbyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718168.6004803-254-15941339762102/AnsiballZ_stat.py'
Dec 02 23:29:28 compute-0 sudo[67156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:29 compute-0 python3.9[67158]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:29 compute-0 sudo[67156]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:29 compute-0 sudo[67279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prfwhbsaajsswvbkxyasezmmkllflgtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718168.6004803-254-15941339762102/AnsiballZ_copy.py'
Dec 02 23:29:29 compute-0 sudo[67279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:29 compute-0 python3.9[67281]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718168.6004803-254-15941339762102/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:29 compute-0 sudo[67279]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:30 compute-0 sudo[67431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzdvefqkzuzahgywjixatswluxpxldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718170.0585322-284-2554590919706/AnsiballZ_systemd.py'
Dec 02 23:29:30 compute-0 sudo[67431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:30 compute-0 python3.9[67433]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:30 compute-0 systemd[1]: Reloading.
Dec 02 23:29:31 compute-0 systemd-rc-local-generator[67457]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:31 compute-0 systemd-sysv-generator[67460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:31 compute-0 systemd[1]: Reloading.
Dec 02 23:29:31 compute-0 systemd-rc-local-generator[67499]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:31 compute-0 systemd-sysv-generator[67502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:31 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 02 23:29:31 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 02 23:29:31 compute-0 sudo[67431]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:32 compute-0 sudo[67659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkyhvzfzxtzqqaklnsxnkeluwilmnefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718171.766512-300-133923197106562/AnsiballZ_stat.py'
Dec 02 23:29:32 compute-0 sudo[67659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:32 compute-0 python3.9[67661]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:32 compute-0 sudo[67659]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:32 compute-0 sudo[67782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwlblqewiryjqcvwtddnumzytgminbai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718171.766512-300-133923197106562/AnsiballZ_copy.py'
Dec 02 23:29:32 compute-0 sudo[67782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:32 compute-0 python3.9[67784]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718171.766512-300-133923197106562/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:32 compute-0 sudo[67782]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:33 compute-0 sudo[67934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuwjesjdxffpsglcvwbunwjybbfhhdfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718173.238965-330-158976328722982/AnsiballZ_stat.py'
Dec 02 23:29:33 compute-0 sudo[67934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:33 compute-0 python3.9[67936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:33 compute-0 sudo[67934]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:34 compute-0 sudo[68057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmhithhvcwxiviolpggjwnecmgrqarfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718173.238965-330-158976328722982/AnsiballZ_copy.py'
Dec 02 23:29:34 compute-0 sudo[68057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:34 compute-0 python3.9[68059]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718173.238965-330-158976328722982/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:34 compute-0 sudo[68057]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:35 compute-0 sudo[68209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfokucryfgxronfqvqhtadmflidxyufq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718174.714345-360-272056735269651/AnsiballZ_systemd.py'
Dec 02 23:29:35 compute-0 sudo[68209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:35 compute-0 python3.9[68211]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:35 compute-0 systemd[1]: Reloading.
Dec 02 23:29:35 compute-0 systemd-rc-local-generator[68239]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:35 compute-0 systemd-sysv-generator[68243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:35 compute-0 systemd[1]: Reloading.
Dec 02 23:29:35 compute-0 systemd-rc-local-generator[68277]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:35 compute-0 systemd-sysv-generator[68281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:35 compute-0 systemd[1]: Starting Create netns directory...
Dec 02 23:29:35 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:29:35 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:29:35 compute-0 systemd[1]: Finished Create netns directory.
Dec 02 23:29:35 compute-0 sudo[68209]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:36 compute-0 python3.9[68438]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:29:36 compute-0 network[68455]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:29:36 compute-0 network[68456]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:29:36 compute-0 network[68457]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:29:42 compute-0 sudo[68718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnmjakggbgxbpnhhufsouzvsprzotyuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718182.619408-392-273977739994282/AnsiballZ_systemd.py'
Dec 02 23:29:42 compute-0 sudo[68718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:43 compute-0 python3.9[68720]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:43 compute-0 systemd[1]: Reloading.
Dec 02 23:29:43 compute-0 systemd-rc-local-generator[68749]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:43 compute-0 systemd-sysv-generator[68752]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:43 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 02 23:29:43 compute-0 iptables.init[68759]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 02 23:29:43 compute-0 iptables.init[68759]: iptables: Flushing firewall rules: [  OK  ]
Dec 02 23:29:43 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 02 23:29:43 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 02 23:29:43 compute-0 sudo[68718]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:44 compute-0 sudo[68954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhqinyszdfiwwjxmdncuavfnakmkfruz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718183.9952593-392-42370699774506/AnsiballZ_systemd.py'
Dec 02 23:29:44 compute-0 sudo[68954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:44 compute-0 python3.9[68956]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:44 compute-0 sudo[68954]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:45 compute-0 sudo[69108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csjpqvuxororvmhvuqgpirnskamzpevs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718185.120426-424-225888346122469/AnsiballZ_systemd.py'
Dec 02 23:29:45 compute-0 sudo[69108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:45 compute-0 python3.9[69110]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:29:45 compute-0 systemd[1]: Reloading.
Dec 02 23:29:45 compute-0 systemd-rc-local-generator[69140]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:29:45 compute-0 systemd-sysv-generator[69145]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:29:45 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 02 23:29:45 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 02 23:29:45 compute-0 sudo[69108]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:46 compute-0 sudo[69301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gonfgfuqqrrhakvdzezizyjerkmcqrkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718186.4499862-440-167946618908058/AnsiballZ_command.py'
Dec 02 23:29:46 compute-0 sudo[69301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:47 compute-0 python3.9[69303]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:29:47 compute-0 sudo[69301]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:48 compute-0 sudo[69454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etcqnhenedpuhmcevrpuilkjtnhavesn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718187.7736094-468-145660846143243/AnsiballZ_stat.py'
Dec 02 23:29:48 compute-0 sudo[69454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:48 compute-0 python3.9[69456]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:48 compute-0 sudo[69454]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:48 compute-0 sudo[69579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aupnicqocpljjnuzsklthuzxihrvvicv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718187.7736094-468-145660846143243/AnsiballZ_copy.py'
Dec 02 23:29:48 compute-0 sudo[69579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:49 compute-0 python3.9[69581]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718187.7736094-468-145660846143243/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:49 compute-0 sudo[69579]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:50 compute-0 sudo[69732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpwwcuzsovshfuaeetvlhzzxtgudnhjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718189.52233-498-243116600825401/AnsiballZ_systemd.py'
Dec 02 23:29:50 compute-0 sudo[69732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:50 compute-0 python3.9[69734]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:29:50 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 02 23:29:50 compute-0 sshd[1004]: Received SIGHUP; restarting.
Dec 02 23:29:50 compute-0 sshd[1004]: Server listening on 0.0.0.0 port 22.
Dec 02 23:29:50 compute-0 sshd[1004]: Server listening on :: port 22.
Dec 02 23:29:50 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 02 23:29:50 compute-0 sudo[69732]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:50 compute-0 sudo[69888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thvoglglkufpnpicjotgfwzmtenmytew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718190.6153405-514-233849227153234/AnsiballZ_file.py'
Dec 02 23:29:50 compute-0 sudo[69888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:51 compute-0 python3.9[69890]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:51 compute-0 sudo[69888]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:51 compute-0 sudo[70040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnzevnndnmnxyhuhtyzikhwttepfjlpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718191.3979447-530-11809437999492/AnsiballZ_stat.py'
Dec 02 23:29:51 compute-0 sudo[70040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:51 compute-0 python3.9[70042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:51 compute-0 sudo[70040]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:52 compute-0 sudo[70163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfnixqyavkuuhktipzejhtsebedeckpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718191.3979447-530-11809437999492/AnsiballZ_copy.py'
Dec 02 23:29:52 compute-0 sudo[70163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:52 compute-0 python3.9[70165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718191.3979447-530-11809437999492/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:52 compute-0 sudo[70163]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:53 compute-0 sudo[70315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fggydjrwajwvkeuurnzjbwntmnfkxuzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718192.9757237-566-111651777418950/AnsiballZ_timezone.py'
Dec 02 23:29:53 compute-0 sudo[70315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:53 compute-0 python3.9[70317]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 02 23:29:53 compute-0 systemd[1]: Starting Time & Date Service...
Dec 02 23:29:53 compute-0 systemd[1]: Started Time & Date Service.
Dec 02 23:29:53 compute-0 sudo[70315]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:54 compute-0 sudo[70471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwxtfrooerfmxafqrqwykktwxnoufpsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718194.3267257-584-16006948025833/AnsiballZ_file.py'
Dec 02 23:29:54 compute-0 sudo[70471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:54 compute-0 python3.9[70473]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:54 compute-0 sudo[70471]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:55 compute-0 sudo[70623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxszueoamxytcfjbwgcmmxyfmyyorccl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718195.1464214-600-79812196350232/AnsiballZ_stat.py'
Dec 02 23:29:55 compute-0 sudo[70623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:55 compute-0 python3.9[70625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:55 compute-0 sudo[70623]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:56 compute-0 sudo[70746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyxltemudpgjuqmnuwjvwmyjfkizsrsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718195.1464214-600-79812196350232/AnsiballZ_copy.py'
Dec 02 23:29:56 compute-0 sudo[70746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:56 compute-0 python3.9[70748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718195.1464214-600-79812196350232/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:56 compute-0 sudo[70746]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:56 compute-0 sudo[70898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmaybgxttwkaerotbibvwhhrmhtnepjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718196.6322877-630-57336444138955/AnsiballZ_stat.py'
Dec 02 23:29:56 compute-0 sudo[70898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:57 compute-0 python3.9[70900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:57 compute-0 sudo[70898]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:57 compute-0 sudo[71021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxrccsdzwyljbjcstsdxyzhtkblrplsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718196.6322877-630-57336444138955/AnsiballZ_copy.py'
Dec 02 23:29:57 compute-0 sudo[71021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:57 compute-0 python3.9[71023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718196.6322877-630-57336444138955/.source.yaml _original_basename=.7ef3nrfa follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:57 compute-0 sudo[71021]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:58 compute-0 sudo[71173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuctnxtmbwisydrmpimxsdvkttcnslrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718198.0288305-660-233567295104217/AnsiballZ_stat.py'
Dec 02 23:29:58 compute-0 sudo[71173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:58 compute-0 python3.9[71175]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:29:58 compute-0 sudo[71173]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:58 compute-0 sudo[71296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wputifwaeleugamgwfhjtunvinycbbvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718198.0288305-660-233567295104217/AnsiballZ_copy.py'
Dec 02 23:29:58 compute-0 sudo[71296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:59 compute-0 python3.9[71298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718198.0288305-660-233567295104217/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:29:59 compute-0 sudo[71296]: pam_unix(sudo:session): session closed for user root
Dec 02 23:29:59 compute-0 sudo[71448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgyxyucyefxbnkvafgwutnejubqxzltq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718199.4127142-690-261206014021152/AnsiballZ_command.py'
Dec 02 23:29:59 compute-0 sudo[71448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:29:59 compute-0 python3.9[71450]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:29:59 compute-0 sudo[71448]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:00 compute-0 sudo[71601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvlzialnoaqpbpfyvoaawdmzniwbcmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718200.2869587-706-227637784238734/AnsiballZ_command.py'
Dec 02 23:30:00 compute-0 sudo[71601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:00 compute-0 python3.9[71603]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:00 compute-0 sudo[71601]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:01 compute-0 sudo[71754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uawsxxquolvdmrlonnvrzuwlhawlgyjm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718201.0480995-722-191447958903330/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:30:01 compute-0 sudo[71754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:01 compute-0 python3[71756]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:30:01 compute-0 sudo[71754]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:02 compute-0 sudo[71906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfhddwpglqqhyjlinfglnhwbutcdreot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718202.0915878-738-135881927473785/AnsiballZ_stat.py'
Dec 02 23:30:02 compute-0 sudo[71906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:02 compute-0 python3.9[71908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:02 compute-0 sudo[71906]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:02 compute-0 sudo[72029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olivenapdfwljoamhurjqovgyiiralfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718202.0915878-738-135881927473785/AnsiballZ_copy.py'
Dec 02 23:30:02 compute-0 sudo[72029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:03 compute-0 python3.9[72031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718202.0915878-738-135881927473785/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:03 compute-0 sudo[72029]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:03 compute-0 sudo[72181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzroxcclhuoyjuazpdwqlgafkolqnspm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718203.565127-768-265723726918203/AnsiballZ_stat.py'
Dec 02 23:30:03 compute-0 sudo[72181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:04 compute-0 python3.9[72183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:04 compute-0 sudo[72181]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:04 compute-0 sudo[72304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuxdkurmhkllmihsetjwfkqulhfgxxwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718203.565127-768-265723726918203/AnsiballZ_copy.py'
Dec 02 23:30:04 compute-0 sudo[72304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:04 compute-0 python3.9[72306]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718203.565127-768-265723726918203/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:04 compute-0 sudo[72304]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:05 compute-0 sudo[72456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matbgsnvobvsyarlyirxjyzmfvzxrivc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718204.9767258-798-10830961579120/AnsiballZ_stat.py'
Dec 02 23:30:05 compute-0 sudo[72456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:05 compute-0 python3.9[72458]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:05 compute-0 sudo[72456]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:05 compute-0 sudo[72579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxrxidaickqcregiloicqdjffugbicm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718204.9767258-798-10830961579120/AnsiballZ_copy.py'
Dec 02 23:30:05 compute-0 sudo[72579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:06 compute-0 python3.9[72581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718204.9767258-798-10830961579120/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:06 compute-0 sudo[72579]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:06 compute-0 sudo[72731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-menufrvzinazsbfhqaxzknoajfosqwwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718206.4957032-828-23392971873026/AnsiballZ_stat.py'
Dec 02 23:30:06 compute-0 sudo[72731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:07 compute-0 python3.9[72733]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:07 compute-0 sudo[72731]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:07 compute-0 sudo[72854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wybcsrmiyaxwhzyqckqvfyumpqvbgqow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718206.4957032-828-23392971873026/AnsiballZ_copy.py'
Dec 02 23:30:07 compute-0 sudo[72854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:07 compute-0 python3.9[72856]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718206.4957032-828-23392971873026/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:07 compute-0 sudo[72854]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:08 compute-0 sudo[73006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpaqwsylylzaqvkhunooatyqqtlxbqcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718208.0252438-858-174017923475206/AnsiballZ_stat.py'
Dec 02 23:30:08 compute-0 sudo[73006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:08 compute-0 python3.9[73008]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:30:08 compute-0 sudo[73006]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:09 compute-0 sudo[73129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djujiskjeemrlmithrayzrnigavabwxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718208.0252438-858-174017923475206/AnsiballZ_copy.py'
Dec 02 23:30:09 compute-0 sudo[73129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:09 compute-0 python3.9[73131]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718208.0252438-858-174017923475206/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:09 compute-0 sudo[73129]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:09 compute-0 sudo[73281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdrlphcobevxcpflkmxnisdcrtwcedhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718209.6505597-888-88729765441145/AnsiballZ_file.py'
Dec 02 23:30:09 compute-0 sudo[73281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:10 compute-0 python3.9[73283]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:10 compute-0 sudo[73281]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:10 compute-0 sudo[73433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxhylyjvptyffioegucnhbrfbdsjoff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718210.3804157-904-228170904609944/AnsiballZ_command.py'
Dec 02 23:30:10 compute-0 sudo[73433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:10 compute-0 python3.9[73435]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:10 compute-0 sudo[73433]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:11 compute-0 sudo[73592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joaguvjotdpkmbqzunkhvtufnaopjcuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718211.2299964-920-241716146759789/AnsiballZ_blockinfile.py'
Dec 02 23:30:11 compute-0 sudo[73592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:11 compute-0 python3.9[73594]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:12 compute-0 sudo[73592]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:12 compute-0 sudo[73745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjsrcjyrpprbovsvzpghlcyaqkfbcfmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718212.3422825-938-154293141134218/AnsiballZ_file.py'
Dec 02 23:30:12 compute-0 sudo[73745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:12 compute-0 python3.9[73747]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:12 compute-0 sudo[73745]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:13 compute-0 sudo[73897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wznibexkvmqmiznqmvothndplxtgfafb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718213.1037278-938-8781734034613/AnsiballZ_file.py'
Dec 02 23:30:13 compute-0 sudo[73897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:13 compute-0 python3.9[73899]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:13 compute-0 sudo[73897]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:14 compute-0 sudo[74049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgotaoqqzhwqwzsdwvccbmtnkkpmbgui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718213.8785815-968-27765625991974/AnsiballZ_mount.py'
Dec 02 23:30:14 compute-0 sudo[74049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:14 compute-0 python3.9[74051]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 02 23:30:14 compute-0 sudo[74049]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:14 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:30:15 compute-0 sudo[74203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqgpyimzvhyowrxipmxlyrczvdrqiwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718214.8103828-968-93145148829424/AnsiballZ_mount.py'
Dec 02 23:30:15 compute-0 sudo[74203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:15 compute-0 python3.9[74205]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 02 23:30:15 compute-0 sudo[74203]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:15 compute-0 sshd-session[65043]: Connection closed by 192.168.122.30 port 50164
Dec 02 23:30:15 compute-0 sshd-session[65040]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:15 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 02 23:30:15 compute-0 systemd[1]: session-15.scope: Consumed 36.706s CPU time.
Dec 02 23:30:15 compute-0 systemd-logind[795]: Session 15 logged out. Waiting for processes to exit.
Dec 02 23:30:15 compute-0 systemd-logind[795]: Removed session 15.
Dec 02 23:30:21 compute-0 sshd-session[74231]: Accepted publickey for zuul from 192.168.122.30 port 34868 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:21 compute-0 systemd-logind[795]: New session 16 of user zuul.
Dec 02 23:30:21 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 02 23:30:21 compute-0 sshd-session[74231]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:30:21 compute-0 sudo[74384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyonspgoaiwqumfasqcuxyjymnhyzjhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718221.4005535-17-174719452398053/AnsiballZ_tempfile.py'
Dec 02 23:30:21 compute-0 sudo[74384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:22 compute-0 python3.9[74386]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 02 23:30:22 compute-0 sudo[74384]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:22 compute-0 sudo[74536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbmhkdkgvyjeqwgwtdaurzmkrdnkspil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718222.3112218-41-34139994827163/AnsiballZ_stat.py'
Dec 02 23:30:22 compute-0 sudo[74536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:22 compute-0 python3.9[74538]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:22 compute-0 sudo[74536]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:23 compute-0 sudo[74688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkwmobjtnlvqczjckmpknmqxultxodzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718223.2136877-61-190244709410024/AnsiballZ_setup.py'
Dec 02 23:30:23 compute-0 sudo[74688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:23 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 02 23:30:24 compute-0 python3.9[74690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:30:24 compute-0 sudo[74688]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:24 compute-0 sudo[74842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wclbzxxjmhwzfskqgjtlfblhdridcpod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718224.41824-78-107967764645016/AnsiballZ_blockinfile.py'
Dec 02 23:30:24 compute-0 sudo[74842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:25 compute-0 python3.9[74844]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsf4O3TC9mqG5pmtZKzDID4ioApUCWMMcMl4FlQ3yDYoM34JJMwDpWmYo9yQeQH7Zz/mYY4kvj34n9pP6UpZh8YYTgrDO/xB5m08yB19hZlBLNgcS18Dl3aCrBlPC7/HRLSXsBGMqfD0dYlcv577j+jpmLyeex2U43tAJwee5EE74TbgHK0hzWiqONZO0KoJC0q2wyLlOa+dZFsIK2fiLjTjwdANF3t0KH6yhzS2J92gfoAUepv4JPBZWhLkuLrx9JrcJMWhKakHpNoy4vezvWVHBo45bBMlwyJABzPuaDKGqpWVe2XSS7CMZRzLLdmpbxOAin0VmmvpX9tf58g912pgSVPka/24eTGrQyyI5roB63r1vZsR9IUAlwsTO90EmGrxGdvIsWQ/aOthlTYdw84AdrxbSpaCzyMYmvjtCVriPliEwTQXsIQHUKv0KyAD2kMAmgBBd/D3seyLAs+Y2xL+gWwoMxhMc6IZFOrzU5UBBAfnLyma1mkx7C3UEu9fs=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAwn7ixIsEYJITP+z2Du/TZpA7vY8Lre+cVRh8KJ//3C
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKfK6GHTjOVRrF8EHoAX2PLtXv9kBkV3qltQW2BmRTPleQCACp3cMUT6m/r0IFqYo99ZvC9bd5l7MrPzRIPHF/w=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvbCYuawkhhMNcsQPSLD9R8MzWMFWBfNxWjJmMzeDHne2TvGIY0DiG11UE+E0lc8fLWiTHmhGyf0GmYFXup/FZEGgMHiuow4IbBTKxK+1QNjVqFO2S2o7o9zF+NooyO1zc2vbwn6D0Is1C3Zk+kyNKxOqKipgjEeFmN+dLdOtNrq/adI/ddM7mbWoJ2sF51XQHbgEt1Ad0ezxCRV1w6buNRIFym2S6pTAPQnkbaqmgQT3Tuq6e45Yvcnw8RY/QvcsMEhodIUNRGQGu4EkUdnY3bG7ucdWSRq5NpUgGVJVxacGyWuQ3pT6V9Mwb5MmOF3C6Nl4E5in3zUjnxxqrfW2uPHaajuvmDlIVoVzZkkVyc7neL/UZ3sg0G8BhCBllzHACU1ZKBdAhC6sj6fZa7rLtzsXXGQq/7Tt1VLSr4A/hna1l3Re/GZ1nnhILvetATInRD43bQChUO6Qys+jY/aug2jC2YYQzxGcBWHZAsYtdcNvZXu+ilZgyhJlx4Mb5mTk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINhbu3i8/fSUjpuw8K1MdLb5KuV5JdkyD7r8WJXXv5aD
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDr3lh4tg7NfAHVqbHGCl7z1xqpJrlsy9GroQGzPqqhUZoSUzEpLTia7mFOGTkU3wwGaWmgSVJctHRDjBh64t0w=
                                             create=True mode=0644 path=/tmp/ansible.8zgcolj3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:25 compute-0 sudo[74842]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:25 compute-0 sudo[74994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prfsetqzmwraygwgicmgbmiiiohzydzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718225.3192167-94-190502269155115/AnsiballZ_command.py'
Dec 02 23:30:25 compute-0 sudo[74994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:26 compute-0 python3.9[74996]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.8zgcolj3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:26 compute-0 sudo[74994]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:26 compute-0 sudo[75148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwpkaqcuwivwvzsbitezxyghbfmicff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718226.2213356-110-200166469612127/AnsiballZ_file.py'
Dec 02 23:30:26 compute-0 sudo[75148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:26 compute-0 python3.9[75150]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.8zgcolj3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:26 compute-0 sudo[75148]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:27 compute-0 sshd-session[74234]: Connection closed by 192.168.122.30 port 34868
Dec 02 23:30:27 compute-0 sshd-session[74231]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:27 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 02 23:30:27 compute-0 systemd[1]: session-16.scope: Consumed 3.563s CPU time.
Dec 02 23:30:27 compute-0 systemd-logind[795]: Session 16 logged out. Waiting for processes to exit.
Dec 02 23:30:27 compute-0 systemd-logind[795]: Removed session 16.
Dec 02 23:30:33 compute-0 sshd-session[75175]: Accepted publickey for zuul from 192.168.122.30 port 48206 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:33 compute-0 systemd-logind[795]: New session 17 of user zuul.
Dec 02 23:30:33 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 02 23:30:33 compute-0 sshd-session[75175]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:30:34 compute-0 python3.9[75328]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:30:35 compute-0 sudo[75482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpdoidvlhcfyvszzbvjjiuotvyqtnivc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718234.5917974-44-95211794012870/AnsiballZ_systemd.py'
Dec 02 23:30:35 compute-0 sudo[75482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:35 compute-0 python3.9[75484]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 23:30:35 compute-0 sudo[75482]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:36 compute-0 sudo[75636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmexlogwnklwkzngoanrdfqumxfzikzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718235.7763305-60-128391850548627/AnsiballZ_systemd.py'
Dec 02 23:30:36 compute-0 sudo[75636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:36 compute-0 python3.9[75638]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:30:36 compute-0 sudo[75636]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:37 compute-0 sudo[75789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbjrhosqylaglevpxewfxpqrwflqgtvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718236.7168856-78-138827338774290/AnsiballZ_command.py'
Dec 02 23:30:37 compute-0 sudo[75789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:37 compute-0 python3.9[75791]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:37 compute-0 sudo[75789]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:38 compute-0 sudo[75942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdnmxpqcbtudkttyjdotenfaqmivcrjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718237.6085129-94-2823934548996/AnsiballZ_stat.py'
Dec 02 23:30:38 compute-0 sudo[75942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:38 compute-0 python3.9[75944]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:38 compute-0 sudo[75942]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:38 compute-0 sudo[76096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yglmaxprxemnxgeodwljayvmoknlfiyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718238.386309-110-155178467899517/AnsiballZ_command.py'
Dec 02 23:30:38 compute-0 sudo[76096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:38 compute-0 python3.9[76098]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:38 compute-0 sudo[76096]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:39 compute-0 sudo[76251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iukgzxzmkuoctzqxacrixgumxsaqbazq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718239.157871-126-242090272283737/AnsiballZ_file.py'
Dec 02 23:30:39 compute-0 sudo[76251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:39 compute-0 python3.9[76253]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:30:39 compute-0 sudo[76251]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:40 compute-0 sshd-session[75178]: Connection closed by 192.168.122.30 port 48206
Dec 02 23:30:40 compute-0 sshd-session[75175]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:40 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 02 23:30:40 compute-0 systemd[1]: session-17.scope: Consumed 4.130s CPU time.
Dec 02 23:30:40 compute-0 systemd-logind[795]: Session 17 logged out. Waiting for processes to exit.
Dec 02 23:30:40 compute-0 systemd-logind[795]: Removed session 17.
Dec 02 23:30:45 compute-0 sshd-session[76279]: Accepted publickey for zuul from 192.168.122.30 port 54016 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:45 compute-0 systemd-logind[795]: New session 18 of user zuul.
Dec 02 23:30:45 compute-0 systemd[1]: Started Session 18 of User zuul.
Dec 02 23:30:45 compute-0 sshd-session[76279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:30:46 compute-0 python3.9[76432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:30:47 compute-0 sudo[76586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yquffuzzlhxaxngmxxlbcgksgkzsdiqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718246.966693-48-182625289322164/AnsiballZ_setup.py'
Dec 02 23:30:47 compute-0 sudo[76586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:47 compute-0 python3.9[76588]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:30:47 compute-0 sudo[76586]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:48 compute-0 sudo[76670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dswccxotsvriraclcuvctftticcktsqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718246.966693-48-182625289322164/AnsiballZ_dnf.py'
Dec 02 23:30:48 compute-0 sudo[76670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:30:48 compute-0 python3.9[76672]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 02 23:30:49 compute-0 sudo[76670]: pam_unix(sudo:session): session closed for user root
Dec 02 23:30:50 compute-0 python3.9[76823]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:30:51 compute-0 python3.9[76976]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:30:51 compute-0 sshd-session[76824]: Received disconnect from 45.78.218.154 port 50812:11: Bye Bye [preauth]
Dec 02 23:30:51 compute-0 sshd-session[76824]: Disconnected from authenticating user root 45.78.218.154 port 50812 [preauth]
Dec 02 23:30:52 compute-0 python3.9[77126]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:53 compute-0 python3.9[77276]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:30:53 compute-0 sshd-session[76282]: Connection closed by 192.168.122.30 port 54016
Dec 02 23:30:53 compute-0 sshd-session[76279]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:30:53 compute-0 systemd-logind[795]: Session 18 logged out. Waiting for processes to exit.
Dec 02 23:30:53 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 02 23:30:53 compute-0 systemd[1]: session-18.scope: Consumed 5.902s CPU time.
Dec 02 23:30:53 compute-0 systemd-logind[795]: Removed session 18.
Dec 02 23:30:59 compute-0 sshd-session[77302]: Accepted publickey for zuul from 192.168.122.30 port 50584 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:30:59 compute-0 systemd-logind[795]: New session 19 of user zuul.
Dec 02 23:30:59 compute-0 systemd[1]: Started Session 19 of User zuul.
Dec 02 23:30:59 compute-0 sshd-session[77302]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:31:00 compute-0 python3.9[77455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:31:02 compute-0 sudo[77609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usumfojguerkrmmqkagobllscxqgwjxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718261.9160848-83-191489115140829/AnsiballZ_file.py'
Dec 02 23:31:02 compute-0 sudo[77609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:02 compute-0 python3.9[77611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:02 compute-0 sudo[77609]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:02 compute-0 sudo[77761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reucncczvfhljdycbldzudnvsurdquji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718262.6642206-83-136471539144640/AnsiballZ_file.py'
Dec 02 23:31:02 compute-0 sudo[77761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:03 compute-0 python3.9[77763]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:03 compute-0 sudo[77761]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:03 compute-0 sudo[77913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txfmzgyhxmilefolbluthtypmouvyirl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718263.4028518-113-196922701640339/AnsiballZ_stat.py'
Dec 02 23:31:03 compute-0 sudo[77913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:03 compute-0 python3.9[77915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:03 compute-0 sudo[77913]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:04 compute-0 sudo[78036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onqzthwiixrlvuvzbywotgvsqcysppel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718263.4028518-113-196922701640339/AnsiballZ_copy.py'
Dec 02 23:31:04 compute-0 sudo[78036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:04 compute-0 python3.9[78038]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718263.4028518-113-196922701640339/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d3bc689386838233e4c282e0a8f678aadfddba48 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:04 compute-0 sudo[78036]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:05 compute-0 sudo[78188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skfpsltcahbluxoekvazjnkbyutyaxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718264.8957345-113-98186471062820/AnsiballZ_stat.py'
Dec 02 23:31:05 compute-0 sudo[78188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:05 compute-0 python3.9[78190]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:05 compute-0 sudo[78188]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:05 compute-0 sudo[78311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycwjfeivevrstzzffxcrwibzizydbuwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718264.8957345-113-98186471062820/AnsiballZ_copy.py'
Dec 02 23:31:05 compute-0 sudo[78311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:06 compute-0 python3.9[78313]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718264.8957345-113-98186471062820/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=6067aac5ab79e06195616248c156299111d0656b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:06 compute-0 sudo[78311]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:06 compute-0 sudo[78463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hojlklzyjvdehevrmssnluoahihpstmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718266.3075247-113-177541571915280/AnsiballZ_stat.py'
Dec 02 23:31:06 compute-0 sudo[78463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:06 compute-0 python3.9[78465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:06 compute-0 sudo[78463]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:06 compute-0 sudo[78586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyomavcbrzmrkseaqwjbgfajoyskjafs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718266.3075247-113-177541571915280/AnsiballZ_copy.py'
Dec 02 23:31:06 compute-0 sudo[78586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:07 compute-0 python3.9[78588]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718266.3075247-113-177541571915280/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=5ca3a278cc79a4efcf153935f67662727c069ef0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:07 compute-0 sudo[78586]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:07 compute-0 sudo[78738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arvixslgqlylkulzklhbxgebxsggesmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718267.4224203-196-169355716640750/AnsiballZ_file.py'
Dec 02 23:31:07 compute-0 sudo[78738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:07 compute-0 python3.9[78740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:07 compute-0 sudo[78738]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:08 compute-0 sudo[78890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwphoduutovhxksjpkcuwlqgrqwentfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718267.982531-196-203417747681985/AnsiballZ_file.py'
Dec 02 23:31:08 compute-0 sudo[78890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:08 compute-0 python3.9[78892]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:08 compute-0 sudo[78890]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:08 compute-0 sudo[79042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lehmuolcojwydunsvmfrjigyeblydfbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718268.6508756-227-138746010260870/AnsiballZ_stat.py'
Dec 02 23:31:08 compute-0 sudo[79042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:09 compute-0 python3.9[79044]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:09 compute-0 sudo[79042]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:09 compute-0 sudo[79165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hssmakdftyyqhdrchjrvrfzugarbroop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718268.6508756-227-138746010260870/AnsiballZ_copy.py'
Dec 02 23:31:09 compute-0 sudo[79165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:09 compute-0 python3.9[79167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718268.6508756-227-138746010260870/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=299133164379db07c5faf58a7df7764619f68999 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:09 compute-0 sudo[79165]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:10 compute-0 sudo[79317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erajhmgikpopryuakwjmydpxhfgozzjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718269.777633-227-117774474183080/AnsiballZ_stat.py'
Dec 02 23:31:10 compute-0 sudo[79317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:10 compute-0 python3.9[79319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:10 compute-0 sudo[79317]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:10 compute-0 sudo[79440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sodkyydmlmhnmcnwkniocflehcyssutd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718269.777633-227-117774474183080/AnsiballZ_copy.py'
Dec 02 23:31:10 compute-0 sudo[79440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:10 compute-0 python3.9[79442]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718269.777633-227-117774474183080/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=b65b6f9d57bc766f19cb07712d4556c236316680 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:11 compute-0 sudo[79440]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:11 compute-0 sudo[79592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqndmckrmhgkidetpkxweedsiffvunjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718271.1311982-227-10611232115413/AnsiballZ_stat.py'
Dec 02 23:31:11 compute-0 sudo[79592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:11 compute-0 python3.9[79594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:11 compute-0 sudo[79592]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:12 compute-0 sudo[79715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bebddznegotjtolzcelspmkteofwkkck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718271.1311982-227-10611232115413/AnsiballZ_copy.py'
Dec 02 23:31:12 compute-0 sudo[79715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:12 compute-0 python3.9[79717]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718271.1311982-227-10611232115413/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3257ae6e0bb3110a03b7614eb5f86eff2ffdbd02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:12 compute-0 sudo[79715]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:12 compute-0 sudo[79867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfvrtoxnptgqcjrphwdnroviagfmodos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718272.4764183-315-117046155558632/AnsiballZ_file.py'
Dec 02 23:31:12 compute-0 sudo[79867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:12 compute-0 python3.9[79869]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:13 compute-0 sudo[79867]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:13 compute-0 sudo[80019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krmezzckbfnvuqvvuerkzbptegmikozi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718273.1522467-315-179090597830809/AnsiballZ_file.py'
Dec 02 23:31:13 compute-0 sudo[80019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:13 compute-0 python3.9[80021]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:13 compute-0 sudo[80019]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:14 compute-0 sudo[80171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmkkdqpzwjxiaryuiiuhefzdohanetbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718273.8857586-348-142737876396313/AnsiballZ_stat.py'
Dec 02 23:31:14 compute-0 sudo[80171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:14 compute-0 python3.9[80173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:14 compute-0 sudo[80171]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:14 compute-0 sudo[80294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yletbkoxoeccyddyjstuldfskgqkqcmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718273.8857586-348-142737876396313/AnsiballZ_copy.py'
Dec 02 23:31:14 compute-0 sudo[80294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:14 compute-0 python3.9[80296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718273.8857586-348-142737876396313/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=29a99d93e6dff1f9d751b3adb3484ca416f0b98f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:14 compute-0 sudo[80294]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:15 compute-0 sudo[80446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxmmauawydyftixyxaupxqpuryugiyfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718274.9794855-348-153397368696192/AnsiballZ_stat.py'
Dec 02 23:31:15 compute-0 sudo[80446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:15 compute-0 python3.9[80448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:15 compute-0 sudo[80446]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:15 compute-0 sudo[80569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shtwrtuyltnjhaysptzaptqzgpfriewx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718274.9794855-348-153397368696192/AnsiballZ_copy.py'
Dec 02 23:31:15 compute-0 sudo[80569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:16 compute-0 python3.9[80571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718274.9794855-348-153397368696192/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=41a15d23d08d17b0ccb97c2ef18e9b5ee7bff7e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:16 compute-0 sudo[80569]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:16 compute-0 sudo[80721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okajgaxxhfopdjulrskbpcszqvaftspg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718276.309853-348-17546108332045/AnsiballZ_stat.py'
Dec 02 23:31:16 compute-0 sudo[80721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:16 compute-0 python3.9[80723]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:16 compute-0 sudo[80721]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:17 compute-0 sudo[80844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffmjkihtrkkmespvjhufrybmvouthczx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718276.309853-348-17546108332045/AnsiballZ_copy.py'
Dec 02 23:31:17 compute-0 sudo[80844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:17 compute-0 python3.9[80846]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718276.309853-348-17546108332045/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7f0d67ecbc13501084c493d148478bebe306b783 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:17 compute-0 sudo[80844]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:17 compute-0 sudo[80996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woymlzbcmpiljlqcyautdnxbmskfkrom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718277.6180923-430-587900535097/AnsiballZ_file.py'
Dec 02 23:31:17 compute-0 sudo[80996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:18 compute-0 python3.9[80998]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:18 compute-0 sudo[80996]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:18 compute-0 sudo[81148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpdosracluxxerjswpoefjfkmrjgftzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718278.2473087-430-6684500282206/AnsiballZ_file.py'
Dec 02 23:31:18 compute-0 sudo[81148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:18 compute-0 python3.9[81150]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:18 compute-0 sudo[81148]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:19 compute-0 sudo[81300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlukxjhsbcoccdypphpghignqbhbanzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718278.9224653-456-8503852820505/AnsiballZ_stat.py'
Dec 02 23:31:19 compute-0 sudo[81300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:19 compute-0 python3.9[81302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:19 compute-0 sudo[81300]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:19 compute-0 sudo[81423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzcledvcebqxajrnuquohifssvgrxmnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718278.9224653-456-8503852820505/AnsiballZ_copy.py'
Dec 02 23:31:19 compute-0 sudo[81423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:19 compute-0 python3.9[81425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718278.9224653-456-8503852820505/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=eba162621d782a2afaec63bcd40cfa6bee9d2808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:19 compute-0 sudo[81423]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:20 compute-0 sudo[81575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwmxlljhlajhytldmdzehqtytcmwquzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718280.0815587-456-1931938383830/AnsiballZ_stat.py'
Dec 02 23:31:20 compute-0 sudo[81575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:20 compute-0 python3.9[81577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:20 compute-0 sudo[81575]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:21 compute-0 chronyd[65014]: Selected source 23.159.16.194 (pool.ntp.org)
Dec 02 23:31:21 compute-0 sudo[81698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzgfvjjarwmsjajvqmkdwmnfnsjsdylz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718280.0815587-456-1931938383830/AnsiballZ_copy.py'
Dec 02 23:31:21 compute-0 sudo[81698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:21 compute-0 python3.9[81700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718280.0815587-456-1931938383830/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=41a15d23d08d17b0ccb97c2ef18e9b5ee7bff7e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:21 compute-0 sudo[81698]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:21 compute-0 sudo[81850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkxyuqrtnnuidinjftcbhmafsbruekvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718281.5986264-456-173009967800225/AnsiballZ_stat.py'
Dec 02 23:31:21 compute-0 sudo[81850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:22 compute-0 python3.9[81852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:22 compute-0 sudo[81850]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:22 compute-0 sudo[81973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jidaveenceufipfvfqafnylgxaejeqcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718281.5986264-456-173009967800225/AnsiballZ_copy.py'
Dec 02 23:31:22 compute-0 sudo[81973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:22 compute-0 python3.9[81975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718281.5986264-456-173009967800225/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=41a361e5f12a19c8b91339c0674b50fcba543f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:22 compute-0 sudo[81973]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:23 compute-0 sudo[82125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmdetskbwoqipywlnslutrwqxfazwclo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718283.5355914-566-59369485345645/AnsiballZ_file.py'
Dec 02 23:31:23 compute-0 sudo[82125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:24 compute-0 python3.9[82127]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:24 compute-0 sudo[82125]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:24 compute-0 sudo[82277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mntpxiuqycfdctfzwjmdbjuahydpkncy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718284.2379928-593-189966691749180/AnsiballZ_stat.py'
Dec 02 23:31:24 compute-0 sudo[82277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:24 compute-0 python3.9[82279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:24 compute-0 sudo[82277]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:25 compute-0 sudo[82400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnglbxrqxkviznfyhcgwwoohrlpavupr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718284.2379928-593-189966691749180/AnsiballZ_copy.py'
Dec 02 23:31:25 compute-0 sudo[82400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:25 compute-0 python3.9[82402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718284.2379928-593-189966691749180/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:25 compute-0 sudo[82400]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:25 compute-0 sudo[82552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwvjhpuupflgincoewvrisrbsmjlxktx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718285.4762688-623-100399634335423/AnsiballZ_file.py'
Dec 02 23:31:25 compute-0 sudo[82552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:25 compute-0 python3.9[82554]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:25 compute-0 sudo[82552]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:26 compute-0 sudo[82704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugkblqgusbmviatzlklyjouioabtwpel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718286.1330867-638-252460246336029/AnsiballZ_stat.py'
Dec 02 23:31:26 compute-0 sudo[82704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:26 compute-0 python3.9[82706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:26 compute-0 sudo[82704]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:27 compute-0 sudo[82827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsmqxnanowhicbovjztxhwozhmllimrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718286.1330867-638-252460246336029/AnsiballZ_copy.py'
Dec 02 23:31:27 compute-0 sudo[82827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:27 compute-0 python3.9[82829]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718286.1330867-638-252460246336029/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:27 compute-0 sudo[82827]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:27 compute-0 sudo[82979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isxtczjuzccfegcsuqqmhwdfyekaoxay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718287.4999201-670-142950561885446/AnsiballZ_file.py'
Dec 02 23:31:27 compute-0 sudo[82979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:28 compute-0 python3.9[82981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:28 compute-0 sudo[82979]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:28 compute-0 sudo[83131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uicijshcwxtbuqvcavltwhpgbkhkqrkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718288.3217607-685-244952045996311/AnsiballZ_stat.py'
Dec 02 23:31:28 compute-0 sudo[83131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:29 compute-0 python3.9[83133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:29 compute-0 sudo[83131]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:29 compute-0 sudo[83254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zacbkauuepmatbcrzikpwdbxkconndzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718288.3217607-685-244952045996311/AnsiballZ_copy.py'
Dec 02 23:31:29 compute-0 sudo[83254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:29 compute-0 python3.9[83256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718288.3217607-685-244952045996311/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:29 compute-0 sudo[83254]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:30 compute-0 sudo[83406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcjshlmwpjbyxxwkpksjvksgezvrnbgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718289.8792148-717-162568397785434/AnsiballZ_file.py'
Dec 02 23:31:30 compute-0 sudo[83406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:30 compute-0 python3.9[83408]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:30 compute-0 sudo[83406]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:30 compute-0 sudo[83558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thyvbojnjhygwyokkpdtjfcsacorkrfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718290.5802758-733-49462625957994/AnsiballZ_stat.py'
Dec 02 23:31:30 compute-0 sudo[83558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:31 compute-0 python3.9[83560]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:31 compute-0 sudo[83558]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:31 compute-0 sudo[83681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tczensqeduvoyxadxpbetcihfvrnvzxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718290.5802758-733-49462625957994/AnsiballZ_copy.py'
Dec 02 23:31:31 compute-0 sudo[83681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:31 compute-0 python3.9[83683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718290.5802758-733-49462625957994/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:31 compute-0 sudo[83681]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:32 compute-0 sudo[83833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sybqmelqkzavzjwtahyiqyzjkgjhdngj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718291.89905-762-5829572864488/AnsiballZ_file.py'
Dec 02 23:31:32 compute-0 sudo[83833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:32 compute-0 python3.9[83835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:32 compute-0 sudo[83833]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:32 compute-0 sudo[83985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfzdgfgejotjayjrgoatiufapkcdtspb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718292.5809114-777-56721165887230/AnsiballZ_stat.py'
Dec 02 23:31:32 compute-0 sudo[83985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:33 compute-0 python3.9[83987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:33 compute-0 sudo[83985]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:33 compute-0 sudo[84108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpfmxccppjzouozjgilxwustbkjhznmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718292.5809114-777-56721165887230/AnsiballZ_copy.py'
Dec 02 23:31:33 compute-0 sudo[84108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:33 compute-0 python3.9[84110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718292.5809114-777-56721165887230/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:33 compute-0 sudo[84108]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:34 compute-0 sudo[84260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwrgalllqfbqnzossfjwkuoubzqosipu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718293.9705157-807-269696506711847/AnsiballZ_file.py'
Dec 02 23:31:34 compute-0 sudo[84260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:34 compute-0 python3.9[84262]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:34 compute-0 sudo[84260]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:35 compute-0 sudo[84412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbewdlpiauqvlmkrmfovwnskxbwemson ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718294.7139983-823-271464492468787/AnsiballZ_stat.py'
Dec 02 23:31:35 compute-0 sudo[84412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:35 compute-0 python3.9[84414]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:35 compute-0 sudo[84412]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:35 compute-0 sudo[84535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chjmkeeqljhrboqkyapkvalzcapbgmdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718294.7139983-823-271464492468787/AnsiballZ_copy.py'
Dec 02 23:31:35 compute-0 sudo[84535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:35 compute-0 python3.9[84537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718294.7139983-823-271464492468787/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:35 compute-0 sudo[84535]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:36 compute-0 sudo[84687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xccsdsziflushjnzevsbmxbmothyhapn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718296.2662983-855-185395074580046/AnsiballZ_file.py'
Dec 02 23:31:36 compute-0 sudo[84687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:36 compute-0 python3.9[84689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:36 compute-0 sudo[84687]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:37 compute-0 sudo[84839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vomvjgrddarifzcypibvzvfzqiiczhpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718297.0418-871-167364877733428/AnsiballZ_stat.py'
Dec 02 23:31:37 compute-0 sudo[84839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:37 compute-0 python3.9[84841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:31:37 compute-0 sudo[84839]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:37 compute-0 sudo[84962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fifjcsqkspwhullzcoctrugagmzfypla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718297.0418-871-167364877733428/AnsiballZ_copy.py'
Dec 02 23:31:37 compute-0 sudo[84962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:38 compute-0 python3.9[84964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718297.0418-871-167364877733428/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f36371576aa58707fc1d2b8554f71ab3575c4735 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:38 compute-0 sudo[84962]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:40 compute-0 sshd-session[77305]: Connection closed by 192.168.122.30 port 50584
Dec 02 23:31:40 compute-0 sshd-session[77302]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:31:40 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Dec 02 23:31:40 compute-0 systemd[1]: session-19.scope: Consumed 29.258s CPU time.
Dec 02 23:31:40 compute-0 systemd-logind[795]: Session 19 logged out. Waiting for processes to exit.
Dec 02 23:31:40 compute-0 systemd-logind[795]: Removed session 19.
Dec 02 23:31:46 compute-0 sshd-session[84989]: Accepted publickey for zuul from 192.168.122.30 port 41134 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:31:46 compute-0 systemd-logind[795]: New session 20 of user zuul.
Dec 02 23:31:46 compute-0 systemd[1]: Started Session 20 of User zuul.
Dec 02 23:31:46 compute-0 sshd-session[84989]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:31:47 compute-0 python3.9[85142]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:31:48 compute-0 sudo[85296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlzkgyobwxlxwsxoqgnlujsxyagulgqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718307.8935342-48-240956967887637/AnsiballZ_file.py'
Dec 02 23:31:48 compute-0 sudo[85296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:48 compute-0 python3.9[85298]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:48 compute-0 sudo[85296]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:49 compute-0 sudo[85448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brwamnuythhbqavmvvvljxhvzajgajwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718308.9144225-48-227457022934835/AnsiballZ_file.py'
Dec 02 23:31:49 compute-0 sudo[85448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:49 compute-0 python3.9[85450]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:31:49 compute-0 sudo[85448]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:50 compute-0 python3.9[85600]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:31:51 compute-0 sudo[85750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szbqexfdhxnpwusqsdtdoevvoymrwfvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718310.9277036-94-83751525182364/AnsiballZ_seboolean.py'
Dec 02 23:31:51 compute-0 sudo[85750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:51 compute-0 python3.9[85752]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 23:31:52 compute-0 sudo[85750]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:53 compute-0 sudo[85906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgexftiocqazkehgfxfzrddeptnzuxxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718313.0727885-114-142686930136551/AnsiballZ_setup.py'
Dec 02 23:31:53 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 02 23:31:53 compute-0 sudo[85906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:53 compute-0 python3.9[85908]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:31:53 compute-0 sudo[85906]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:54 compute-0 sudo[85990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgvnrpnpizzcgokijnzrocgipzaitzna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718313.0727885-114-142686930136551/AnsiballZ_dnf.py'
Dec 02 23:31:54 compute-0 sudo[85990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:54 compute-0 python3.9[85992]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:31:55 compute-0 sudo[85990]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:56 compute-0 sudo[86143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzqaobzqxmqmygbykigyocniavxghobw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718316.167991-138-133315110627401/AnsiballZ_systemd.py'
Dec 02 23:31:56 compute-0 sudo[86143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:57 compute-0 python3.9[86145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:31:57 compute-0 sudo[86143]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:57 compute-0 sudo[86298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvwayxpbryyznjypspssymtgoexgpqbn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718317.4549484-154-28705362665060/AnsiballZ_edpm_nftables_snippet.py'
Dec 02 23:31:57 compute-0 sudo[86298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:58 compute-0 python3[86300]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 02 23:31:58 compute-0 sudo[86298]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:58 compute-0 sudo[86450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdyxhjjsgphscriekdaayrkovewfayoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718318.4813862-172-1491197940252/AnsiballZ_file.py'
Dec 02 23:31:58 compute-0 sudo[86450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:58 compute-0 python3.9[86452]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:31:58 compute-0 sudo[86450]: pam_unix(sudo:session): session closed for user root
Dec 02 23:31:59 compute-0 sudo[86602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhzwpimglpzpaalylezstwoobtmgoqop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718319.2749662-188-37532553103240/AnsiballZ_stat.py'
Dec 02 23:31:59 compute-0 sudo[86602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:31:59 compute-0 python3.9[86604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:00 compute-0 sudo[86602]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:00 compute-0 sudo[86680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fubhkguzuzrwhnsfbysxcbvudixjooow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718319.2749662-188-37532553103240/AnsiballZ_file.py'
Dec 02 23:32:00 compute-0 sudo[86680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:00 compute-0 python3.9[86682]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:00 compute-0 sudo[86680]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:01 compute-0 sudo[86832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlyffaudvhbkemsdjbvucntvnwmbffez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718320.9120622-212-158305893064582/AnsiballZ_stat.py'
Dec 02 23:32:01 compute-0 sudo[86832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:01 compute-0 python3.9[86834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:01 compute-0 sudo[86832]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:01 compute-0 sudo[86910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgfasaiudcyipukwhezqtvhrnowlnaok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718320.9120622-212-158305893064582/AnsiballZ_file.py'
Dec 02 23:32:01 compute-0 sudo[86910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:01 compute-0 python3.9[86912]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.jhurg2h_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:01 compute-0 sudo[86910]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:02 compute-0 sudo[87062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcoadewrgrfygpwuqbimqopypjxguxbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718322.304415-236-243450833374162/AnsiballZ_stat.py'
Dec 02 23:32:02 compute-0 sudo[87062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:02 compute-0 python3.9[87064]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:02 compute-0 sudo[87062]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:03 compute-0 sudo[87140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxcaukcqihpldjrrcpjpafkqjollanot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718322.304415-236-243450833374162/AnsiballZ_file.py'
Dec 02 23:32:03 compute-0 sudo[87140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:03 compute-0 python3.9[87142]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:03 compute-0 sudo[87140]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:04 compute-0 sudo[87292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfakoealhkkoovoevxynylvbuiyifkmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718323.629619-262-178061654855244/AnsiballZ_command.py'
Dec 02 23:32:04 compute-0 sudo[87292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:04 compute-0 python3.9[87294]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:04 compute-0 sudo[87292]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:05 compute-0 sudo[87445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypvrcbhsvqcxcxnipfhbdkxmrqixcmdw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718324.6428192-278-59220643544776/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:32:05 compute-0 sudo[87445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:05 compute-0 python3[87447]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:32:05 compute-0 sudo[87445]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:05 compute-0 sudo[87597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lobamxvxvofvrwrazgvtztatcgpniong ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718325.5958145-294-138904543103572/AnsiballZ_stat.py'
Dec 02 23:32:05 compute-0 sudo[87597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:06 compute-0 python3.9[87599]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:06 compute-0 sudo[87597]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:06 compute-0 sudo[87722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anniukirofzlepnsilzpbtgrzvjqoqww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718325.5958145-294-138904543103572/AnsiballZ_copy.py'
Dec 02 23:32:06 compute-0 sudo[87722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:06 compute-0 python3.9[87724]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718325.5958145-294-138904543103572/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:07 compute-0 sudo[87722]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:07 compute-0 sudo[87874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjafpxkjkwddxcfxgyzpdbrufqkphxwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718327.3095222-324-107990439725891/AnsiballZ_stat.py'
Dec 02 23:32:07 compute-0 sudo[87874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:07 compute-0 python3.9[87876]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:07 compute-0 sudo[87874]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:08 compute-0 sudo[87999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsiqfzlfzpxmirsznuawrufklgmonupf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718327.3095222-324-107990439725891/AnsiballZ_copy.py'
Dec 02 23:32:08 compute-0 sudo[87999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:08 compute-0 python3.9[88001]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718327.3095222-324-107990439725891/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:08 compute-0 sudo[87999]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:09 compute-0 sudo[88151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqbncifmyyqaqtmzjsgjgmzsevzuhozg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718328.8108356-354-49691059990500/AnsiballZ_stat.py'
Dec 02 23:32:09 compute-0 sudo[88151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:09 compute-0 python3.9[88153]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:09 compute-0 sudo[88151]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:09 compute-0 sudo[88276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsgvvcqceztulfnlfbdajfixbahcltlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718328.8108356-354-49691059990500/AnsiballZ_copy.py'
Dec 02 23:32:09 compute-0 sudo[88276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:09 compute-0 python3.9[88278]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718328.8108356-354-49691059990500/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:09 compute-0 sudo[88276]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:10 compute-0 sudo[88428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrmjhhwyrydiiurcimowkjttlznccciq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718330.32692-384-4833916068156/AnsiballZ_stat.py'
Dec 02 23:32:10 compute-0 sudo[88428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:10 compute-0 python3.9[88430]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:10 compute-0 sudo[88428]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:11 compute-0 sudo[88553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgajhkmzlekffirhrotkwirojqgrwuae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718330.32692-384-4833916068156/AnsiballZ_copy.py'
Dec 02 23:32:11 compute-0 sudo[88553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:11 compute-0 python3.9[88555]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718330.32692-384-4833916068156/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:11 compute-0 sudo[88553]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:12 compute-0 sudo[88705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqsyxjqbpmxmwdlcqqlpatvipintafcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718331.814524-414-158309509278267/AnsiballZ_stat.py'
Dec 02 23:32:12 compute-0 sudo[88705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:12 compute-0 python3.9[88707]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:12 compute-0 sudo[88705]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:12 compute-0 sudo[88830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaysjmwadkiidqaigredseespvxfdqvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718331.814524-414-158309509278267/AnsiballZ_copy.py'
Dec 02 23:32:12 compute-0 sudo[88830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:13 compute-0 python3.9[88832]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718331.814524-414-158309509278267/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:13 compute-0 sudo[88830]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:13 compute-0 sudo[88982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxqmdvgheejahhtouyvxnysxsdlwydrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718333.4661496-444-103998144045497/AnsiballZ_file.py'
Dec 02 23:32:13 compute-0 sudo[88982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:14 compute-0 python3.9[88984]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:14 compute-0 sudo[88982]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:14 compute-0 sudo[89134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptyqnilxqdfusptkhctgxfxwmzajjwmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718334.2688746-460-12104255454067/AnsiballZ_command.py'
Dec 02 23:32:14 compute-0 sudo[89134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:14 compute-0 python3.9[89136]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:14 compute-0 sudo[89134]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:15 compute-0 sudo[89289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjcjsqtxjjpreyugsldlysqbqxuizrmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718335.2115228-476-150143631794234/AnsiballZ_blockinfile.py'
Dec 02 23:32:15 compute-0 sudo[89289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:15 compute-0 python3.9[89291]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:15 compute-0 sudo[89289]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:16 compute-0 sudo[89441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqhojylwlbraywhkogzdzmqckrruaktj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718336.3028517-494-168953086152952/AnsiballZ_command.py'
Dec 02 23:32:16 compute-0 sudo[89441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:16 compute-0 python3.9[89443]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:16 compute-0 sudo[89441]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:17 compute-0 sudo[89594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbjjuyvjregfjvbgzehqmgjafkpvrnuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718337.138786-510-195996221600134/AnsiballZ_stat.py'
Dec 02 23:32:17 compute-0 sudo[89594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:17 compute-0 python3.9[89596]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:17 compute-0 sudo[89594]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:18 compute-0 sudo[89748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzdpblgljeqpmxygnroqgpuxfuytkzaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718337.9116104-526-41155652400183/AnsiballZ_command.py'
Dec 02 23:32:18 compute-0 sudo[89748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:18 compute-0 python3.9[89750]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:18 compute-0 sudo[89748]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:19 compute-0 sudo[89903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejjitnvxgwxwqzaxknappwjuuftqepeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718338.771309-542-104770143212506/AnsiballZ_file.py'
Dec 02 23:32:19 compute-0 sudo[89903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:19 compute-0 python3.9[89905]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:19 compute-0 sudo[89903]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:20 compute-0 python3.9[90055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:32:21 compute-0 sudo[90206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djmuixzoecokrvmjzviahootzyerngse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718341.5191212-622-26109742036228/AnsiballZ_command.py'
Dec 02 23:32:21 compute-0 sudo[90206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:22 compute-0 python3.9[90208]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:22 compute-0 ovs-vsctl[90209]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 02 23:32:22 compute-0 sudo[90206]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:22 compute-0 sudo[90359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztameqmsdawgfwvihohrpgdhornynyjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718342.4558773-640-119584228917762/AnsiballZ_command.py'
Dec 02 23:32:22 compute-0 sudo[90359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:23 compute-0 python3.9[90361]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:23 compute-0 sudo[90359]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:23 compute-0 sudo[90514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqqlfpxnralpajtnxfikivdkrptsnfsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718343.2252784-656-71988627717987/AnsiballZ_command.py'
Dec 02 23:32:23 compute-0 sudo[90514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:23 compute-0 python3.9[90516]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:23 compute-0 ovs-vsctl[90517]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 02 23:32:23 compute-0 sudo[90514]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:24 compute-0 python3.9[90667]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:25 compute-0 sudo[90819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-artuohadxnsfmyzdunyawnorahrdrlfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718344.975956-690-82209165152614/AnsiballZ_file.py'
Dec 02 23:32:25 compute-0 sudo[90819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:25 compute-0 python3.9[90821]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:25 compute-0 sudo[90819]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:26 compute-0 sudo[90971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxujoneodyvrozyfvaoyyfztfysakmrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718345.8211699-706-222196207750203/AnsiballZ_stat.py'
Dec 02 23:32:26 compute-0 sudo[90971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:26 compute-0 python3.9[90973]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:26 compute-0 sudo[90971]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:26 compute-0 sudo[91049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiseqtmgbffwmkxudbkzjqqoxwqjwwdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718345.8211699-706-222196207750203/AnsiballZ_file.py'
Dec 02 23:32:26 compute-0 sudo[91049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:26 compute-0 python3.9[91051]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:26 compute-0 sudo[91049]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:27 compute-0 sudo[91201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-falufiuvtdjzpiodsxikgmloiairkzvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718347.0792832-706-62787670720607/AnsiballZ_stat.py'
Dec 02 23:32:27 compute-0 sudo[91201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:27 compute-0 python3.9[91203]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:27 compute-0 sudo[91201]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:27 compute-0 sudo[91279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdxyqwbpwlvcykxqsfkndeumtstnsjfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718347.0792832-706-62787670720607/AnsiballZ_file.py'
Dec 02 23:32:27 compute-0 sudo[91279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:28 compute-0 python3.9[91281]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:28 compute-0 sudo[91279]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:28 compute-0 sudo[91431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uulehhqzjmtokivjwqiooiddcfuvprsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718348.38341-752-36501404926109/AnsiballZ_file.py'
Dec 02 23:32:28 compute-0 sudo[91431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:28 compute-0 python3.9[91433]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:28 compute-0 sudo[91431]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:29 compute-0 sudo[91583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwneazhdgmgdvrlwdubixnwvbefvicbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718349.2973287-768-30723897428043/AnsiballZ_stat.py'
Dec 02 23:32:29 compute-0 sudo[91583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:29 compute-0 python3.9[91585]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:29 compute-0 sudo[91583]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:30 compute-0 sudo[91661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynvrvfrnkxfcvokymycyoxjajlkpnkas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718349.2973287-768-30723897428043/AnsiballZ_file.py'
Dec 02 23:32:30 compute-0 sudo[91661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:30 compute-0 python3.9[91663]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:30 compute-0 sudo[91661]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:30 compute-0 sudo[91813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnqpxbvmrpwzhetkcqjexqmkqjkisrzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718350.5810094-792-268249182089381/AnsiballZ_stat.py'
Dec 02 23:32:30 compute-0 sudo[91813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:31 compute-0 python3.9[91815]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:31 compute-0 sudo[91813]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:31 compute-0 sudo[91891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jystfixisnuttutafdcbkpckgvivuwbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718350.5810094-792-268249182089381/AnsiballZ_file.py'
Dec 02 23:32:31 compute-0 sudo[91891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:31 compute-0 python3.9[91893]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:31 compute-0 sudo[91891]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:32 compute-0 sudo[92043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swthuhkwhciwukduoyhevyhjkynizbel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718351.9284742-816-139142873915627/AnsiballZ_systemd.py'
Dec 02 23:32:32 compute-0 sudo[92043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:32 compute-0 python3.9[92045]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:32:32 compute-0 systemd[1]: Reloading.
Dec 02 23:32:32 compute-0 systemd-sysv-generator[92077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:32 compute-0 systemd-rc-local-generator[92074]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:32 compute-0 sudo[92043]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:33 compute-0 sudo[92233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibgcnafgsrnnqghymtnulskpwbjsokiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718353.215476-832-185947623661134/AnsiballZ_stat.py'
Dec 02 23:32:33 compute-0 sudo[92233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:33 compute-0 python3.9[92235]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:33 compute-0 sudo[92233]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:34 compute-0 sudo[92311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bndzimnkdbjecibfvbzthueflurabsve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718353.215476-832-185947623661134/AnsiballZ_file.py'
Dec 02 23:32:34 compute-0 sudo[92311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:34 compute-0 python3.9[92313]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:34 compute-0 sudo[92311]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:35 compute-0 sudo[92463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shvyqgelymdndgibjnuswaklwscelnsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718354.8209283-856-231877382944569/AnsiballZ_stat.py'
Dec 02 23:32:35 compute-0 sudo[92463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:35 compute-0 python3.9[92465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:35 compute-0 sudo[92463]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:35 compute-0 sudo[92541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmgakozfekpoknwuwwgcnotpdscmmalz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718354.8209283-856-231877382944569/AnsiballZ_file.py'
Dec 02 23:32:35 compute-0 sudo[92541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:35 compute-0 python3.9[92543]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:35 compute-0 sudo[92541]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:36 compute-0 sudo[92693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzawoqyugjzujqpcuagfmmociqvtlfhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718356.1892672-880-75854466154640/AnsiballZ_systemd.py'
Dec 02 23:32:36 compute-0 sudo[92693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:36 compute-0 python3.9[92695]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:32:36 compute-0 systemd[1]: Reloading.
Dec 02 23:32:36 compute-0 systemd-sysv-generator[92727]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:37 compute-0 systemd-rc-local-generator[92720]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:38 compute-0 systemd[1]: Starting Create netns directory...
Dec 02 23:32:38 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:32:38 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:32:38 compute-0 systemd[1]: Finished Create netns directory.
Dec 02 23:32:38 compute-0 sudo[92693]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:38 compute-0 sudo[92887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtfseaqhxyrbddosnsogggeuupwrfmeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718358.5127385-900-146442108973421/AnsiballZ_file.py'
Dec 02 23:32:38 compute-0 sudo[92887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:39 compute-0 python3.9[92889]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:39 compute-0 sudo[92887]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:39 compute-0 sudo[93039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjavznhltxsaxcchdhejzlixwkuebmgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718359.3292673-916-169690809397452/AnsiballZ_stat.py'
Dec 02 23:32:39 compute-0 sudo[93039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:39 compute-0 python3.9[93041]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:39 compute-0 sudo[93039]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:40 compute-0 sudo[93162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvlyshqboihvpzigryhiucjjklnjewif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718359.3292673-916-169690809397452/AnsiballZ_copy.py'
Dec 02 23:32:40 compute-0 sudo[93162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:40 compute-0 python3.9[93164]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718359.3292673-916-169690809397452/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:40 compute-0 sudo[93162]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:41 compute-0 sudo[93314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pawkpgoiuzdqhkfuhqayjxojewvbecws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718360.908933-950-257214151804798/AnsiballZ_file.py'
Dec 02 23:32:41 compute-0 sudo[93314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:41 compute-0 python3.9[93316]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:32:41 compute-0 sudo[93314]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:41 compute-0 sudo[93466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayybtrddngqbepvkntyeybtexzctcenx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718361.6704414-966-42130960863433/AnsiballZ_stat.py'
Dec 02 23:32:41 compute-0 sudo[93466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:42 compute-0 python3.9[93468]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:32:42 compute-0 sudo[93466]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:42 compute-0 sudo[93589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyxjemxplrilfuyyegyzuhlfuvjbviut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718361.6704414-966-42130960863433/AnsiballZ_copy.py'
Dec 02 23:32:42 compute-0 sudo[93589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:42 compute-0 python3.9[93591]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718361.6704414-966-42130960863433/.source.json _original_basename=.9_spl23m follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:42 compute-0 sudo[93589]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:43 compute-0 sudo[93741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svugxgwyryzyjxrsvapbyeyeoqjqbron ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718363.1177578-996-237939546211643/AnsiballZ_file.py'
Dec 02 23:32:43 compute-0 sudo[93741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:43 compute-0 python3.9[93743]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:43 compute-0 sudo[93741]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:44 compute-0 sudo[93893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eukgroanfquukxqitvichokshsxhzgmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718363.966312-1012-81535593169731/AnsiballZ_stat.py'
Dec 02 23:32:44 compute-0 sudo[93893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:44 compute-0 sudo[93893]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:44 compute-0 sudo[94016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rblgnmichocrrduntfmhwtiwxlxmqrie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718363.966312-1012-81535593169731/AnsiballZ_copy.py'
Dec 02 23:32:44 compute-0 sudo[94016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:44 compute-0 sudo[94016]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:46 compute-0 sudo[94168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfkbruaxclsxlpecbecenmpdkztdcayn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718365.5550463-1046-214765964985681/AnsiballZ_container_config_data.py'
Dec 02 23:32:46 compute-0 sudo[94168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:46 compute-0 python3.9[94170]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 02 23:32:46 compute-0 sudo[94168]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:47 compute-0 sudo[94320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfqhvhquprolbogqjdfrwovznqzwsjge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718366.6129825-1064-33266447190919/AnsiballZ_container_config_hash.py'
Dec 02 23:32:47 compute-0 sudo[94320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:47 compute-0 python3.9[94322]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:32:47 compute-0 sudo[94320]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:48 compute-0 sudo[94472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyehupnffodlzljbomaqxqpujdpfsdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718367.5839636-1082-243787687221203/AnsiballZ_podman_container_info.py'
Dec 02 23:32:48 compute-0 sudo[94472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:48 compute-0 python3.9[94474]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 23:32:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:48 compute-0 sudo[94472]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:49 compute-0 sudo[94635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joyqquxjktnorrmuhxkkavioucarfwyv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718369.3402147-1108-84522243272784/AnsiballZ_edpm_container_manage.py'
Dec 02 23:32:49 compute-0 sudo[94635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:49 compute-0 python3[94637]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:32:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:50 compute-0 podman[94673]: 2025-12-02 23:32:50.141757641 +0000 UTC m=+0.049354736 container create e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 23:32:50 compute-0 podman[94673]: 2025-12-02 23:32:50.115369708 +0000 UTC m=+0.022966833 image pull 78889ae0cf8c3740f43b6df72a2c4568ab589fb816614851d476abc277d3fffb 38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Dec 02 23:32:50 compute-0 python3[94637]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Dec 02 23:32:50 compute-0 sudo[94635]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:50 compute-0 sudo[94862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcspfbvlpsbiswzycicohuihkagofjoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718370.618295-1124-275312194476210/AnsiballZ_stat.py'
Dec 02 23:32:50 compute-0 sudo[94862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 02 23:32:51 compute-0 python3.9[94864]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:51 compute-0 sudo[94862]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:51 compute-0 sudo[95016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfsfkesrkdswsijrqhsmawrfvqhsmmis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718371.473883-1142-187069291825663/AnsiballZ_file.py'
Dec 02 23:32:51 compute-0 sudo[95016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:52 compute-0 python3.9[95018]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:52 compute-0 sudo[95016]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:52 compute-0 sudo[95092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkethgedcsiskvoivawavfikalgddhfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718371.473883-1142-187069291825663/AnsiballZ_stat.py'
Dec 02 23:32:52 compute-0 sudo[95092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:52 compute-0 python3.9[95094]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:32:52 compute-0 sudo[95092]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:53 compute-0 sudo[95243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytivrnvtxnmlpqsiacscwplsoiblxvnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718372.591989-1142-3723109838473/AnsiballZ_copy.py'
Dec 02 23:32:53 compute-0 sudo[95243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:53 compute-0 python3.9[95245]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718372.591989-1142-3723109838473/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:32:53 compute-0 sudo[95243]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:53 compute-0 sudo[95319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siwzewnhhiwbefnhqjqunnnqmcwfllak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718372.591989-1142-3723109838473/AnsiballZ_systemd.py'
Dec 02 23:32:53 compute-0 sudo[95319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:53 compute-0 python3.9[95321]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:32:53 compute-0 systemd[1]: Reloading.
Dec 02 23:32:54 compute-0 systemd-rc-local-generator[95349]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:54 compute-0 systemd-sysv-generator[95353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:54 compute-0 sudo[95319]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:54 compute-0 sudo[95430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlgodasifdzgttqhllwmwllteuofhmih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718372.591989-1142-3723109838473/AnsiballZ_systemd.py'
Dec 02 23:32:54 compute-0 sudo[95430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:54 compute-0 python3.9[95432]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:32:54 compute-0 systemd[1]: Reloading.
Dec 02 23:32:54 compute-0 systemd-rc-local-generator[95459]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:54 compute-0 systemd-sysv-generator[95463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:55 compute-0 systemd[1]: Starting ovn_controller container...
Dec 02 23:32:55 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 02 23:32:55 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:32:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66330cea78589c99283af03b161260eb9d35d6d346887fbb6cde0bee4dc3ef8d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 23:32:55 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6.
Dec 02 23:32:55 compute-0 podman[95473]: 2025-12-02 23:32:55.293006249 +0000 UTC m=+0.166702810 container init e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + sudo -E kolla_set_configs
Dec 02 23:32:55 compute-0 podman[95473]: 2025-12-02 23:32:55.323931985 +0000 UTC m=+0.197628536 container start e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 02 23:32:55 compute-0 edpm-start-podman-container[95473]: ovn_controller
Dec 02 23:32:55 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 02 23:32:55 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 02 23:32:55 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 02 23:32:55 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 02 23:32:55 compute-0 systemd[95517]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 02 23:32:55 compute-0 edpm-start-podman-container[95472]: Creating additional drop-in dependency for "ovn_controller" (e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6)
Dec 02 23:32:55 compute-0 podman[95495]: 2025-12-02 23:32:55.41879867 +0000 UTC m=+0.074782763 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:32:55 compute-0 systemd[1]: e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6-681645db4bf16794.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:32:55 compute-0 systemd[1]: e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6-681645db4bf16794.service: Failed with result 'exit-code'.
Dec 02 23:32:55 compute-0 systemd[1]: Reloading.
Dec 02 23:32:55 compute-0 systemd[95517]: Queued start job for default target Main User Target.
Dec 02 23:32:55 compute-0 systemd[95517]: Created slice User Application Slice.
Dec 02 23:32:55 compute-0 systemd[95517]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 02 23:32:55 compute-0 systemd[95517]: Started Daily Cleanup of User's Temporary Directories.
Dec 02 23:32:55 compute-0 systemd[95517]: Reached target Paths.
Dec 02 23:32:55 compute-0 systemd[95517]: Reached target Timers.
Dec 02 23:32:55 compute-0 systemd-rc-local-generator[95578]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:32:55 compute-0 systemd[95517]: Starting D-Bus User Message Bus Socket...
Dec 02 23:32:55 compute-0 systemd[95517]: Starting Create User's Volatile Files and Directories...
Dec 02 23:32:55 compute-0 systemd[95517]: Listening on D-Bus User Message Bus Socket.
Dec 02 23:32:55 compute-0 systemd[95517]: Reached target Sockets.
Dec 02 23:32:55 compute-0 systemd[95517]: Finished Create User's Volatile Files and Directories.
Dec 02 23:32:55 compute-0 systemd[95517]: Reached target Basic System.
Dec 02 23:32:55 compute-0 systemd[95517]: Reached target Main User Target.
Dec 02 23:32:55 compute-0 systemd[95517]: Startup finished in 140ms.
Dec 02 23:32:55 compute-0 systemd-sysv-generator[95581]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:32:55 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 02 23:32:55 compute-0 systemd[1]: Started ovn_controller container.
Dec 02 23:32:55 compute-0 systemd[1]: Started Session c1 of User root.
Dec 02 23:32:55 compute-0 sudo[95430]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:55 compute-0 ovn_controller[95488]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:32:55 compute-0 ovn_controller[95488]: INFO:__main__:Validating config file
Dec 02 23:32:55 compute-0 ovn_controller[95488]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:32:55 compute-0 ovn_controller[95488]: INFO:__main__:Writing out command to execute
Dec 02 23:32:55 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 02 23:32:55 compute-0 ovn_controller[95488]: ++ cat /run_command
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + ARGS=
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + sudo kolla_copy_cacerts
Dec 02 23:32:55 compute-0 systemd[1]: Started Session c2 of User root.
Dec 02 23:32:55 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + [[ ! -n '' ]]
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + . kolla_extend_start
Dec 02 23:32:55 compute-0 ovn_controller[95488]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + umask 0022
Dec 02 23:32:55 compute-0 ovn_controller[95488]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 02 23:32:55 compute-0 ovn_controller[95488]: 2025-12-02T23:32:55Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 02 23:32:55 compute-0 NetworkManager[55671]: <info>  [1764718375.8545] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 02 23:32:55 compute-0 NetworkManager[55671]: <info>  [1764718375.8553] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 02 23:32:55 compute-0 NetworkManager[55671]: <info>  [1764718375.8564] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 02 23:32:55 compute-0 NetworkManager[55671]: <info>  [1764718375.8570] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 02 23:32:55 compute-0 NetworkManager[55671]: <info>  [1764718375.8573] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 02 23:32:55 compute-0 kernel: br-int: entered promiscuous mode
Dec 02 23:32:55 compute-0 systemd-udevd[95625]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:32:56 compute-0 sudo[95753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojyvxhnvwqrbourjfppdrjckrogejlua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718375.9704137-1198-195422814502173/AnsiballZ_command.py'
Dec 02 23:32:56 compute-0 sudo[95753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:56 compute-0 python3.9[95755]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:56 compute-0 ovs-vsctl[95756]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 02 23:32:56 compute-0 sudo[95753]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00025|main|INFO|OVS feature set changed, force recompute.
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00034|features|INFO|OVS Feature: group_support, state: supported
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00035|main|INFO|OVS feature set changed, force recompute.
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 02 23:32:56 compute-0 ovn_controller[95488]: 2025-12-02T23:32:56Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 02 23:32:56 compute-0 NetworkManager[55671]: <info>  [1764718376.8930] manager: (ovn-5cbf2a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 02 23:32:56 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 02 23:32:56 compute-0 NetworkManager[55671]: <info>  [1764718376.9212] device (genev_sys_6081): carrier: link connected
Dec 02 23:32:56 compute-0 NetworkManager[55671]: <info>  [1764718376.9218] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 02 23:32:56 compute-0 NetworkManager[55671]: <info>  [1764718376.9516] manager: (ovn-e895a6-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 02 23:32:57 compute-0 sudo[95909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbfaohliwjbqttmhggopzwhsmhguobxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718376.8129702-1214-204457946361310/AnsiballZ_command.py'
Dec 02 23:32:57 compute-0 sudo[95909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:57 compute-0 python3.9[95911]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:57 compute-0 ovs-vsctl[95913]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 02 23:32:57 compute-0 sudo[95909]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:58 compute-0 sudo[96064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgxosvjaljkbrqdydrifgewzrylplzmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718377.778613-1242-81048001644789/AnsiballZ_command.py'
Dec 02 23:32:58 compute-0 sudo[96064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:32:58 compute-0 python3.9[96066]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:32:58 compute-0 ovs-vsctl[96067]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 02 23:32:58 compute-0 sudo[96064]: pam_unix(sudo:session): session closed for user root
Dec 02 23:32:58 compute-0 sshd-session[84992]: Connection closed by 192.168.122.30 port 41134
Dec 02 23:32:58 compute-0 sshd-session[84989]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:32:58 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Dec 02 23:32:58 compute-0 systemd[1]: session-20.scope: Consumed 48.738s CPU time.
Dec 02 23:32:58 compute-0 systemd-logind[795]: Session 20 logged out. Waiting for processes to exit.
Dec 02 23:32:58 compute-0 systemd-logind[795]: Removed session 20.
Dec 02 23:33:04 compute-0 sshd-session[96093]: Accepted publickey for zuul from 192.168.122.30 port 33638 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:33:04 compute-0 systemd-logind[795]: New session 22 of user zuul.
Dec 02 23:33:04 compute-0 systemd[1]: Started Session 22 of User zuul.
Dec 02 23:33:04 compute-0 sshd-session[96093]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:33:05 compute-0 python3.9[96246]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:33:06 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 02 23:33:06 compute-0 systemd[95517]: Activating special unit Exit the Session...
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped target Main User Target.
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped target Basic System.
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped target Paths.
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped target Sockets.
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped target Timers.
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 02 23:33:06 compute-0 systemd[95517]: Closed D-Bus User Message Bus Socket.
Dec 02 23:33:06 compute-0 systemd[95517]: Stopped Create User's Volatile Files and Directories.
Dec 02 23:33:06 compute-0 systemd[95517]: Removed slice User Application Slice.
Dec 02 23:33:06 compute-0 systemd[95517]: Reached target Shutdown.
Dec 02 23:33:06 compute-0 systemd[95517]: Finished Exit the Session.
Dec 02 23:33:06 compute-0 systemd[95517]: Reached target Exit the Session.
Dec 02 23:33:06 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 02 23:33:06 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 02 23:33:06 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 02 23:33:06 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 02 23:33:06 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 02 23:33:06 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 02 23:33:06 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 02 23:33:06 compute-0 sudo[96403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uppapuuoydlqwamfpbawkaotqegthshq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718385.8992925-48-47539136059306/AnsiballZ_file.py'
Dec 02 23:33:06 compute-0 sudo[96403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:06 compute-0 python3.9[96405]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:06 compute-0 sudo[96403]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:07 compute-0 sudo[96555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxrfacubougxgtpbdjkblhstzdksbxdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718386.8359065-48-164530018631931/AnsiballZ_file.py'
Dec 02 23:33:07 compute-0 sudo[96555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:07 compute-0 python3.9[96557]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:07 compute-0 sudo[96555]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:07 compute-0 sudo[96707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehaijvoykeapfiiiajhlwgfthjsmxdro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718387.556213-48-140353702574774/AnsiballZ_file.py'
Dec 02 23:33:07 compute-0 sudo[96707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:08 compute-0 python3.9[96709]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:08 compute-0 sudo[96707]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:08 compute-0 sudo[96859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqidcypxosqsctuhurqxhudtywvmatki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718388.186153-48-208012080282580/AnsiballZ_file.py'
Dec 02 23:33:08 compute-0 sudo[96859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:08 compute-0 python3.9[96861]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:08 compute-0 sudo[96859]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:09 compute-0 sudo[97013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywlhvmkibtgeoosimqmdnsevxefhyssp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718388.8627512-48-52924730946355/AnsiballZ_file.py'
Dec 02 23:33:09 compute-0 sudo[97013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:09 compute-0 python3.9[97015]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:09 compute-0 sudo[97013]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:10 compute-0 python3.9[97165]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:33:10 compute-0 sshd-session[96938]: Invalid user conectar from 45.78.218.154 port 44880
Dec 02 23:33:11 compute-0 sshd-session[96938]: Received disconnect from 45.78.218.154 port 44880:11: Bye Bye [preauth]
Dec 02 23:33:11 compute-0 sshd-session[96938]: Disconnected from invalid user conectar 45.78.218.154 port 44880 [preauth]
Dec 02 23:33:11 compute-0 sudo[97317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwvmfxzlnxlvsdldzdkqznkqbyfgkakk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718390.6593544-136-66794986491287/AnsiballZ_seboolean.py'
Dec 02 23:33:11 compute-0 sudo[97317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:11 compute-0 sshd-session[97166]: Invalid user prueba from 80.94.95.115 port 41338
Dec 02 23:33:11 compute-0 python3.9[97319]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 02 23:33:11 compute-0 sshd-session[97166]: Connection closed by invalid user prueba 80.94.95.115 port 41338 [preauth]
Dec 02 23:33:11 compute-0 sudo[97317]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:12 compute-0 python3.9[97469]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:13 compute-0 sshd-session[96092]: error: kex_exchange_identification: read: Connection timed out
Dec 02 23:33:13 compute-0 sshd-session[96092]: banner exchange: Connection from 14.103.141.38 port 57804: Connection timed out
Dec 02 23:33:13 compute-0 python3.9[97590]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718392.188761-152-130022167540727/.source follow=False _original_basename=haproxy.j2 checksum=66fe13ac5fc047d8fb3860998b97ca468880e317 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:14 compute-0 python3.9[97740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:14 compute-0 python3.9[97861]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718393.8051295-182-50050098009513/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:15 compute-0 sudo[98012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjdxhjjxvddtzczedypgrkuwxvnoyppc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718395.4223545-216-94435675645496/AnsiballZ_setup.py'
Dec 02 23:33:15 compute-0 sudo[98012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:16 compute-0 python3.9[98014]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:33:16 compute-0 sudo[98012]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:16 compute-0 sudo[98096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kopdornxztcdvjrlgidanuefkxwudxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718395.4223545-216-94435675645496/AnsiballZ_dnf.py'
Dec 02 23:33:16 compute-0 sudo[98096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:16 compute-0 python3.9[98098]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:33:18 compute-0 sudo[98096]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:19 compute-0 sudo[98249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kacrsiupvuoehgpbnqrtpoasnmwpffrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718398.502985-240-110575755831843/AnsiballZ_systemd.py'
Dec 02 23:33:19 compute-0 sudo[98249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:19 compute-0 python3.9[98251]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:33:19 compute-0 sudo[98249]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:21 compute-0 python3.9[98404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:21 compute-0 python3.9[98525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718400.9419043-256-185116193404942/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:22 compute-0 python3.9[98675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:23 compute-0 python3.9[98796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718402.078452-256-125012421962969/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:24 compute-0 python3.9[98946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:25 compute-0 python3.9[99067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718404.298167-344-104159632884078/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:25 compute-0 ovn_controller[95488]: 2025-12-02T23:33:25Z|00038|memory|INFO|15956 kB peak resident set size after 29.7 seconds
Dec 02 23:33:25 compute-0 ovn_controller[95488]: 2025-12-02T23:33:25Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec 02 23:33:25 compute-0 podman[99068]: 2025-12-02 23:33:25.569671736 +0000 UTC m=+0.112861273 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 02 23:33:26 compute-0 python3.9[99242]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:26 compute-0 python3.9[99363]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718405.6467264-344-185140437124275/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:27 compute-0 python3.9[99513]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:33:28 compute-0 sudo[99665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luqeuumxsiylfdkktlcbuoapemezdnui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718408.3922677-420-17784116369320/AnsiballZ_file.py'
Dec 02 23:33:28 compute-0 sudo[99665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:28 compute-0 python3.9[99667]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:28 compute-0 sudo[99665]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:29 compute-0 sudo[99817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueynywqqlwwnhcozaupbqhsvpuibzqoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718409.285624-436-29085587201668/AnsiballZ_stat.py'
Dec 02 23:33:29 compute-0 sudo[99817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:29 compute-0 python3.9[99819]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:29 compute-0 sudo[99817]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:30 compute-0 sudo[99895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxkivlqssaearrmxnytkfsoiuewuyqxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718409.285624-436-29085587201668/AnsiballZ_file.py'
Dec 02 23:33:30 compute-0 sudo[99895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:30 compute-0 python3.9[99897]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:30 compute-0 sudo[99895]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:30 compute-0 sudo[100047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btrlkjnvrxivymikuitmixvymhpatvzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718410.3958988-436-245402993550701/AnsiballZ_stat.py'
Dec 02 23:33:30 compute-0 sudo[100047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:30 compute-0 python3.9[100049]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:30 compute-0 sudo[100047]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:31 compute-0 sudo[100125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whjwmiomefexelzywaeoazfcgkqyftwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718410.3958988-436-245402993550701/AnsiballZ_file.py'
Dec 02 23:33:31 compute-0 sudo[100125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:31 compute-0 python3.9[100127]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:31 compute-0 sudo[100125]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:32 compute-0 sudo[100277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gebbtoafrspbheimgzjtytbqtjlezdzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718412.2348163-482-76199510457740/AnsiballZ_file.py'
Dec 02 23:33:32 compute-0 sudo[100277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:32 compute-0 python3.9[100279]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:32 compute-0 sudo[100277]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:33 compute-0 sudo[100429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrjczxwkcnfnimurhptqmjommnlhrcju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718413.060581-498-54417658237605/AnsiballZ_stat.py'
Dec 02 23:33:33 compute-0 sudo[100429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:33 compute-0 python3.9[100431]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:33 compute-0 sudo[100429]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:33 compute-0 sudo[100507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydiotsxhsrjxajpjgtejrcsxihvvmzuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718413.060581-498-54417658237605/AnsiballZ_file.py'
Dec 02 23:33:33 compute-0 sudo[100507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:33 compute-0 python3.9[100509]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:33 compute-0 sudo[100507]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:34 compute-0 sudo[100659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yconrgmmjczfhtkzkllerxtljewhykhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718414.491082-522-268441516098243/AnsiballZ_stat.py'
Dec 02 23:33:34 compute-0 sudo[100659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:35 compute-0 python3.9[100661]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:35 compute-0 sudo[100659]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:35 compute-0 sudo[100737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjnjxufdokstbvppoaqrtfnfyrqiancy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718414.491082-522-268441516098243/AnsiballZ_file.py'
Dec 02 23:33:35 compute-0 sudo[100737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:35 compute-0 python3.9[100739]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:35 compute-0 sudo[100737]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:36 compute-0 sudo[100889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgmzmevuawauvtmncgmnntgytrrkrjkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718415.9070199-546-42858640320331/AnsiballZ_systemd.py'
Dec 02 23:33:36 compute-0 sudo[100889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:36 compute-0 python3.9[100891]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:33:36 compute-0 systemd[1]: Reloading.
Dec 02 23:33:36 compute-0 systemd-rc-local-generator[100915]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:36 compute-0 systemd-sysv-generator[100919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:36 compute-0 sudo[100889]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:37 compute-0 sudo[101078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nppwnemrvsvzrtvsuhtxgwywfbivbhwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718417.2675323-562-20033262646160/AnsiballZ_stat.py'
Dec 02 23:33:37 compute-0 sudo[101078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:37 compute-0 python3.9[101080]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:37 compute-0 sudo[101078]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:37 compute-0 sudo[101156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmpsqqzayogijiafcbtwugjawcboucwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718417.2675323-562-20033262646160/AnsiballZ_file.py'
Dec 02 23:33:37 compute-0 sudo[101156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:38 compute-0 python3.9[101158]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:38 compute-0 sudo[101156]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:38 compute-0 sudo[101308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfjmnphpboaedszmeiylcbpfgtpsohfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718418.6670523-586-118918946652540/AnsiballZ_stat.py'
Dec 02 23:33:38 compute-0 sudo[101308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:39 compute-0 python3.9[101310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:39 compute-0 sudo[101308]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:39 compute-0 sudo[101386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpabowktisrjpxymcxbsdlpafnxkoruz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718418.6670523-586-118918946652540/AnsiballZ_file.py'
Dec 02 23:33:39 compute-0 sudo[101386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:39 compute-0 python3.9[101388]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:39 compute-0 sudo[101386]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:40 compute-0 sudo[101538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctipjiyxkhuabucvlpapwdozkaciqzuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718420.1234145-610-226473078155819/AnsiballZ_systemd.py'
Dec 02 23:33:40 compute-0 sudo[101538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:40 compute-0 python3.9[101540]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:33:40 compute-0 systemd[1]: Reloading.
Dec 02 23:33:40 compute-0 systemd-rc-local-generator[101570]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:40 compute-0 systemd-sysv-generator[101575]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:41 compute-0 systemd[1]: Starting Create netns directory...
Dec 02 23:33:41 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:33:41 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:33:41 compute-0 systemd[1]: Finished Create netns directory.
Dec 02 23:33:41 compute-0 sudo[101538]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:41 compute-0 sudo[101733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqnyshthoievdlbzknbneadgscgethbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718421.5425851-630-129088746722719/AnsiballZ_file.py'
Dec 02 23:33:41 compute-0 sudo[101733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:42 compute-0 python3.9[101735]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:42 compute-0 sudo[101733]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:42 compute-0 sudo[101885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpcjkzzmpioohekihyvfscbhumhneril ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718422.4354017-646-177079551057408/AnsiballZ_stat.py'
Dec 02 23:33:42 compute-0 sudo[101885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:42 compute-0 python3.9[101887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:42 compute-0 sudo[101885]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:43 compute-0 sudo[102008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fshyzgpisjgfibkjncioqxezxeemckal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718422.4354017-646-177079551057408/AnsiballZ_copy.py'
Dec 02 23:33:43 compute-0 sudo[102008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:43 compute-0 python3.9[102010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718422.4354017-646-177079551057408/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:43 compute-0 sudo[102008]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:44 compute-0 sudo[102160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckegmdeejefcshhibjeooxgldwjivzil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718424.1204257-680-248788064714865/AnsiballZ_file.py'
Dec 02 23:33:44 compute-0 sudo[102160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:44 compute-0 python3.9[102162]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:33:44 compute-0 sudo[102160]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:45 compute-0 sudo[102312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snolnpdkvdbmtssbsgmwutzozzzdcqdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718424.95518-696-65509951831432/AnsiballZ_stat.py'
Dec 02 23:33:45 compute-0 sudo[102312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:45 compute-0 python3.9[102314]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:33:45 compute-0 sudo[102312]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:45 compute-0 sudo[102435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgzwlgndkwzyfvybvhyusxirmukwrlmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718424.95518-696-65509951831432/AnsiballZ_copy.py'
Dec 02 23:33:46 compute-0 sudo[102435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:46 compute-0 python3.9[102437]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718424.95518-696-65509951831432/.source.json _original_basename=.aeh_283e follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:46 compute-0 sudo[102435]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:46 compute-0 sudo[102589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekokljlvtzlgogiezurgvirmloquwmeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718426.4520693-726-228482507160271/AnsiballZ_file.py'
Dec 02 23:33:46 compute-0 sudo[102589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:47 compute-0 python3.9[102591]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:47 compute-0 sudo[102589]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:47 compute-0 sshd-session[102438]: Received disconnect from 49.247.36.49 port 51782:11: Bye Bye [preauth]
Dec 02 23:33:47 compute-0 sshd-session[102438]: Disconnected from authenticating user root 49.247.36.49 port 51782 [preauth]
Dec 02 23:33:47 compute-0 sudo[102741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbncjqfqfaznkvtnjueavkenummtllss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718427.3671935-742-85986002243470/AnsiballZ_stat.py'
Dec 02 23:33:47 compute-0 sudo[102741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:47 compute-0 sudo[102741]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:48 compute-0 sudo[102864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbwhqovkgbtckvpipbqgeswqhwbnlaau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718427.3671935-742-85986002243470/AnsiballZ_copy.py'
Dec 02 23:33:48 compute-0 sudo[102864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:48 compute-0 sudo[102864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:49 compute-0 sudo[103017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xliyjvzvqjbiqxkjlmqriwisiqvlobda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718429.3256228-776-270281481130081/AnsiballZ_container_config_data.py'
Dec 02 23:33:49 compute-0 sudo[103017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:49 compute-0 python3.9[103019]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 02 23:33:49 compute-0 sudo[103017]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:50 compute-0 sudo[103169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apckoiqpnlwksmezqqfzemizjmuzagqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718430.1738536-794-169309870936918/AnsiballZ_container_config_hash.py'
Dec 02 23:33:50 compute-0 sudo[103169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:50 compute-0 python3.9[103171]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:33:50 compute-0 sudo[103169]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:51 compute-0 sudo[103321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prypiqgefdfcolxnngdtyqcaxpwkumbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718431.294629-812-203677932992307/AnsiballZ_podman_container_info.py'
Dec 02 23:33:51 compute-0 sudo[103321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:52 compute-0 python3.9[103323]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 23:33:52 compute-0 sudo[103321]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:53 compute-0 sudo[103499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqokuskdaxeejelybeagharabjconsta ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718432.924247-838-9628528042505/AnsiballZ_edpm_container_manage.py'
Dec 02 23:33:53 compute-0 sudo[103499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:53 compute-0 python3[103501]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:33:53 compute-0 podman[103536]: 2025-12-02 23:33:53.900967992 +0000 UTC m=+0.049616422 container create 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 23:33:53 compute-0 podman[103536]: 2025-12-02 23:33:53.871367947 +0000 UTC m=+0.020016387 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:33:53 compute-0 python3[103501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:33:54 compute-0 sudo[103499]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:54 compute-0 sudo[103722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfetvockrgidwjuavgayddcqckzwtbal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718434.2298498-854-89154229248685/AnsiballZ_stat.py'
Dec 02 23:33:54 compute-0 sudo[103722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:54 compute-0 python3.9[103724]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:33:54 compute-0 sudo[103722]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:55 compute-0 sudo[103876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-armbkyysjicsarqbleavoimlsrwssgpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718435.2670403-872-116242818385010/AnsiballZ_file.py'
Dec 02 23:33:55 compute-0 sudo[103876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:55 compute-0 python3.9[103878]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:55 compute-0 sudo[103876]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:56 compute-0 sudo[103962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilxoxljyifqfcibxlrakoxxgugdigfpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718435.2670403-872-116242818385010/AnsiballZ_stat.py'
Dec 02 23:33:56 compute-0 sudo[103962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:56 compute-0 podman[103926]: 2025-12-02 23:33:56.084414135 +0000 UTC m=+0.099716338 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:33:56 compute-0 python3.9[103966]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:33:56 compute-0 sudo[103962]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:56 compute-0 sudo[104128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rykhwnjbdrruiyvufvblshzempxnxjor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718436.321362-872-233469447799075/AnsiballZ_copy.py'
Dec 02 23:33:56 compute-0 sudo[104128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:56 compute-0 python3.9[104130]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718436.321362-872-233469447799075/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:33:56 compute-0 sudo[104128]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:57 compute-0 sudo[104204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unficxjkfcizeanumisgjdikayueeeqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718436.321362-872-233469447799075/AnsiballZ_systemd.py'
Dec 02 23:33:57 compute-0 sudo[104204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:57 compute-0 python3.9[104206]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:33:57 compute-0 systemd[1]: Reloading.
Dec 02 23:33:57 compute-0 systemd-rc-local-generator[104231]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:57 compute-0 systemd-sysv-generator[104234]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:57 compute-0 sudo[104204]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:57 compute-0 sudo[104315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykurieiciixbgljdzhvhhbndpnctadzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718436.321362-872-233469447799075/AnsiballZ_systemd.py'
Dec 02 23:33:57 compute-0 sudo[104315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:33:58 compute-0 python3.9[104317]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:33:58 compute-0 systemd[1]: Reloading.
Dec 02 23:33:58 compute-0 systemd-rc-local-generator[104350]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:58 compute-0 systemd-sysv-generator[104353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:58 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 02 23:33:58 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:33:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb428239d84dbd11915797d1b40495a73aaaa34ead4b524d30f54d33ead493c9/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 02 23:33:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb428239d84dbd11915797d1b40495a73aaaa34ead4b524d30f54d33ead493c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:33:58 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41.
Dec 02 23:33:58 compute-0 podman[104358]: 2025-12-02 23:33:58.696975593 +0000 UTC m=+0.120977665 container init 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4)
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + sudo -E kolla_set_configs
Dec 02 23:33:58 compute-0 podman[104358]: 2025-12-02 23:33:58.725276316 +0000 UTC m=+0.149278388 container start 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 02 23:33:58 compute-0 edpm-start-podman-container[104358]: ovn_metadata_agent
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Validating config file
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Copying service configuration files
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Writing out command to execute
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 02 23:33:58 compute-0 edpm-start-podman-container[104357]: Creating additional drop-in dependency for "ovn_metadata_agent" (282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41)
Dec 02 23:33:58 compute-0 podman[104381]: 2025-12-02 23:33:58.786363027 +0000 UTC m=+0.051089269 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: ++ cat /run_command
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + CMD=neutron-ovn-metadata-agent
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + ARGS=
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + sudo kolla_copy_cacerts
Dec 02 23:33:58 compute-0 systemd[1]: Reloading.
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + [[ ! -n '' ]]
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + . kolla_extend_start
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: Running command: 'neutron-ovn-metadata-agent'
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + umask 0022
Dec 02 23:33:58 compute-0 ovn_metadata_agent[104374]: + exec neutron-ovn-metadata-agent
Dec 02 23:33:58 compute-0 systemd-sysv-generator[104455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:33:58 compute-0 systemd-rc-local-generator[104452]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:33:59 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 02 23:33:59 compute-0 sudo[104315]: pam_unix(sudo:session): session closed for user root
Dec 02 23:33:59 compute-0 sshd-session[102891]: error: kex_exchange_identification: read: Connection timed out
Dec 02 23:33:59 compute-0 sshd-session[102891]: banner exchange: Connection from 183.232.230.82 port 43015: Connection timed out
Dec 02 23:34:00 compute-0 sshd-session[96096]: Connection closed by 192.168.122.30 port 33638
Dec 02 23:34:00 compute-0 sshd-session[96093]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:34:00 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Dec 02 23:34:00 compute-0 systemd[1]: session-22.scope: Consumed 36.269s CPU time.
Dec 02 23:34:00 compute-0 systemd-logind[795]: Session 22 logged out. Waiting for processes to exit.
Dec 02 23:34:00 compute-0 systemd-logind[795]: Removed session 22.
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.614 104379 INFO neutron.common.config [-] Logging enabled!
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.614 104379 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.614 104379 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.615 104379 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.616 104379 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.617 104379 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.618 104379 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.619 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.77 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.620 104379 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.621 104379 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.622 104379 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.623 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.624 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.625 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.626 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.627 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.628 104379 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.629 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.630 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.631 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.632 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.633 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.634 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.635 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.636 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.637 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.638 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.639 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.640 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.641 104379 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.648 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.648 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.649 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.649 104379 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.649 104379 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.657 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 83290d9e-bd8f-4c21-b54d-356f7c3da39f (UUID: 83290d9e-bd8f-4c21-b54d-356f7c3da39f) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.684 104379 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.685 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.685 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.685 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.685 104379 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.688 104379 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.693 104379 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.701 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '83290d9e-bd8f-4c21-b54d-356f7c3da39f'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], external_ids={}, name=83290d9e-bd8f-4c21-b54d-356f7c3da39f, nb_cfg_timestamp=1764718384886, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:00.703 104379 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpkzuvz8dv/privsep.sock']
Dec 02 23:34:01 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.342 104379 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.342 104379 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkzuvz8dv/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.219 104499 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.222 104499 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.224 104499 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.224 104499 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104499
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.344 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[be6fae75-6c0e-4b69-a694-0739c7ba9bf5]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.808 104499 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.808 104499 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:34:01 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:01.808 104499 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:34:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:02.245 104499 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 02 23:34:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:02.250 104499 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 02 23:34:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:02.285 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[6fda2059-c6da-4269-a97f-5b53d81dcace]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:34:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:02.286 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, column=external_ids, values=({'neutron:ovn-metadata-id': '8065dc3e-0b77-58df-8cc7-d6c5fc5f3437'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:34:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:02.294 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:34:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:34:02.307 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:34:05 compute-0 sshd-session[104504]: Accepted publickey for zuul from 192.168.122.30 port 60008 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:34:05 compute-0 systemd-logind[795]: New session 23 of user zuul.
Dec 02 23:34:05 compute-0 systemd[1]: Started Session 23 of User zuul.
Dec 02 23:34:05 compute-0 sshd-session[104504]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:34:06 compute-0 python3.9[104657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:34:08 compute-0 sudo[104811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgpwxuuwtnsewvrqtgckzptdsqralbrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718447.6558344-48-205654636989448/AnsiballZ_command.py'
Dec 02 23:34:08 compute-0 sudo[104811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:08 compute-0 python3.9[104813]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:08 compute-0 sudo[104811]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:09 compute-0 sudo[104975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkrcraqokatljocfaxafgehmjmuljfae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718448.8021126-70-274516163188920/AnsiballZ_systemd_service.py'
Dec 02 23:34:09 compute-0 sudo[104975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:09 compute-0 python3.9[104977]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:34:09 compute-0 systemd[1]: Reloading.
Dec 02 23:34:09 compute-0 systemd-rc-local-generator[105003]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:34:09 compute-0 systemd-sysv-generator[105007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:34:09 compute-0 sudo[104975]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:10 compute-0 python3.9[105162]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:34:10 compute-0 network[105179]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:34:10 compute-0 network[105180]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:34:10 compute-0 network[105181]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:34:16 compute-0 sudo[105440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhjhcssjmtxsgsdunzyukzyaaetmodmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718456.2580516-108-255879659458980/AnsiballZ_systemd_service.py'
Dec 02 23:34:16 compute-0 sudo[105440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:16 compute-0 python3.9[105442]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:17 compute-0 sudo[105440]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:17 compute-0 sudo[105593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qumnlcbzfymfcwosndwdaaxsplsfczmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718457.1835618-108-199759127154843/AnsiballZ_systemd_service.py'
Dec 02 23:34:17 compute-0 sudo[105593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:17 compute-0 python3.9[105595]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:17 compute-0 sudo[105593]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:18 compute-0 sudo[105746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tafveuhuosvemvcozcxxcrtzwixycjxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718457.9365306-108-67570157126754/AnsiballZ_systemd_service.py'
Dec 02 23:34:18 compute-0 sudo[105746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:18 compute-0 python3.9[105748]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:18 compute-0 sudo[105746]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:19 compute-0 sudo[105899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-textbzlschvuacernsgqdwtrjavaavpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718458.670931-108-115013883308150/AnsiballZ_systemd_service.py'
Dec 02 23:34:19 compute-0 sudo[105899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:19 compute-0 python3.9[105901]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:19 compute-0 sudo[105899]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:19 compute-0 sudo[106052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwzlojhpskpwfuahvkomflztsrrpahah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718459.6197658-108-272917835038376/AnsiballZ_systemd_service.py'
Dec 02 23:34:19 compute-0 sudo[106052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:20 compute-0 python3.9[106054]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:20 compute-0 sudo[106052]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:20 compute-0 sudo[106205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orzkmjtztytglknzrdroxzakzegkhreo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718460.3583558-108-71690030425330/AnsiballZ_systemd_service.py'
Dec 02 23:34:20 compute-0 sudo[106205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:20 compute-0 python3.9[106207]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:20 compute-0 sudo[106205]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:21 compute-0 sudo[106358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phpjhwtipoitwmsabkgndkmqxyqockwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718461.1304674-108-137346977872740/AnsiballZ_systemd_service.py'
Dec 02 23:34:21 compute-0 sudo[106358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:21 compute-0 python3.9[106360]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:34:21 compute-0 sudo[106358]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:22 compute-0 sudo[106511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcflubqtvbsaoevukqjhaufrvvhpzyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718462.3620107-212-218870977191468/AnsiballZ_file.py'
Dec 02 23:34:22 compute-0 sudo[106511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:23 compute-0 python3.9[106513]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:23 compute-0 sudo[106511]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:23 compute-0 sudo[106663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bopjycvnonhvzyhkexfjvfcwdhvwzmxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718463.2604837-212-118750381565781/AnsiballZ_file.py'
Dec 02 23:34:23 compute-0 sudo[106663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:23 compute-0 python3.9[106665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:23 compute-0 sudo[106663]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:24 compute-0 sudo[106815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyusnkneysalblfobdqqhgqvdgxiitqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718463.882101-212-87875651388819/AnsiballZ_file.py'
Dec 02 23:34:24 compute-0 sudo[106815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:24 compute-0 python3.9[106817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:24 compute-0 sudo[106815]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:24 compute-0 sudo[106967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikaatckjiihtwhwsecfpskkpbdvcogib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718464.51025-212-6188881808005/AnsiballZ_file.py'
Dec 02 23:34:24 compute-0 sudo[106967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:25 compute-0 python3.9[106969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:25 compute-0 sudo[106967]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:25 compute-0 sudo[107119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asnlddexmuoivupifkittgzslruuxtpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718465.2044828-212-155728476775597/AnsiballZ_file.py'
Dec 02 23:34:25 compute-0 sudo[107119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:25 compute-0 python3.9[107121]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:25 compute-0 sudo[107119]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:26 compute-0 sudo[107286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yutbkrosiagpfybmekskbvrpwvqarulo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718465.8755174-212-113088551580584/AnsiballZ_file.py'
Dec 02 23:34:26 compute-0 sudo[107286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:26 compute-0 podman[107245]: 2025-12-02 23:34:26.323782012 +0000 UTC m=+0.115960413 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 23:34:26 compute-0 python3.9[107291]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:26 compute-0 sudo[107286]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:26 compute-0 sudo[107449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgdphingczkifvtpdvunzrfhrlbznxpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718466.647917-212-66144650726685/AnsiballZ_file.py'
Dec 02 23:34:26 compute-0 sudo[107449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:27 compute-0 python3.9[107451]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:27 compute-0 sudo[107449]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:27 compute-0 sudo[107601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sabiwsgovlwwtoghzysgubhisvlpesyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718467.6490347-312-260127965306182/AnsiballZ_file.py'
Dec 02 23:34:27 compute-0 sudo[107601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:28 compute-0 python3.9[107603]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:28 compute-0 sudo[107601]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:28 compute-0 sudo[107753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icezogeimfsolymmifypdzmyuinmytnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718468.3270261-312-37138445513464/AnsiballZ_file.py'
Dec 02 23:34:28 compute-0 sudo[107753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:28 compute-0 python3.9[107755]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:28 compute-0 sudo[107753]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:29 compute-0 podman[107780]: 2025-12-02 23:34:29.103359128 +0000 UTC m=+0.052400111 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 02 23:34:29 compute-0 sudo[107924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykjotthawmasylbfwzdgsznmzustjtkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718469.07143-312-204569438871358/AnsiballZ_file.py'
Dec 02 23:34:29 compute-0 sudo[107924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:29 compute-0 python3.9[107926]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:29 compute-0 sudo[107924]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:30 compute-0 sudo[108076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jykktwjpgakonllndaytyboqyyhuyjja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718469.7647557-312-116974836113219/AnsiballZ_file.py'
Dec 02 23:34:30 compute-0 sudo[108076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:30 compute-0 python3.9[108078]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:30 compute-0 sudo[108076]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:30 compute-0 sudo[108230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liuxhwdsthgsbbfjoqrpgqtwpwkihapm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718470.3205602-312-192167023798314/AnsiballZ_file.py'
Dec 02 23:34:30 compute-0 sudo[108230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:30 compute-0 python3.9[108232]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:30 compute-0 sudo[108230]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:31 compute-0 sudo[108382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clnoulvxiqyqqzyzaufwupmrobnyosbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718470.9747434-312-244367359949597/AnsiballZ_file.py'
Dec 02 23:34:31 compute-0 sudo[108382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:31 compute-0 python3.9[108384]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:31 compute-0 sudo[108382]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:32 compute-0 sudo[108534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlqqrouoqourprsbozboqzbuwleagnwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718471.7306209-312-232186144545880/AnsiballZ_file.py'
Dec 02 23:34:32 compute-0 sudo[108534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:32 compute-0 python3.9[108536]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:34:32 compute-0 sudo[108534]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:32 compute-0 sshd-session[108126]: Invalid user es from 45.78.219.213 port 35960
Dec 02 23:34:32 compute-0 sshd-session[108126]: Received disconnect from 45.78.219.213 port 35960:11: Bye Bye [preauth]
Dec 02 23:34:32 compute-0 sshd-session[108126]: Disconnected from invalid user es 45.78.219.213 port 35960 [preauth]
Dec 02 23:34:33 compute-0 sudo[108686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpgjdswffzkdurznyojlzlpvxjwydjhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718473.0291378-414-61470099829016/AnsiballZ_command.py'
Dec 02 23:34:33 compute-0 sudo[108686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:33 compute-0 python3.9[108688]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:33 compute-0 sudo[108686]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:34 compute-0 python3.9[108840]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:34:35 compute-0 sudo[108990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyxnddxzjxkyvstgizfnsqaetqqzxkhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718474.882568-450-63493801290450/AnsiballZ_systemd_service.py'
Dec 02 23:34:35 compute-0 sudo[108990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:35 compute-0 python3.9[108992]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:34:35 compute-0 systemd[1]: Reloading.
Dec 02 23:34:35 compute-0 systemd-sysv-generator[109025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:34:35 compute-0 systemd-rc-local-generator[109020]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:34:35 compute-0 sudo[108990]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:36 compute-0 sudo[109178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaztakvtfthzsehjbecaiknyoiqxqcuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718476.146628-466-238586957665423/AnsiballZ_command.py'
Dec 02 23:34:36 compute-0 sudo[109178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:36 compute-0 python3.9[109180]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:36 compute-0 sudo[109178]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:37 compute-0 sudo[109331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfaagccqiknfuxqvelelneupkghjsmmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718476.971674-466-41711139983193/AnsiballZ_command.py'
Dec 02 23:34:37 compute-0 sudo[109331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:37 compute-0 python3.9[109333]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:37 compute-0 sudo[109331]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:38 compute-0 sudo[109484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erpdpbitwkamsxwqlwxbmtvqgoonmven ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718477.6648204-466-80911454186366/AnsiballZ_command.py'
Dec 02 23:34:38 compute-0 sudo[109484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:38 compute-0 python3.9[109486]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:38 compute-0 sudo[109484]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:38 compute-0 sudo[109637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jewngqbgpvrzdogrjcvdcwbpkckwurlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718478.4323661-466-149703386305027/AnsiballZ_command.py'
Dec 02 23:34:38 compute-0 sudo[109637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:38 compute-0 python3.9[109639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:38 compute-0 sudo[109637]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:39 compute-0 sudo[109790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jprewdhdlzdtahmwfhbqjfhelrzgzfhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718479.1464946-466-97606062702457/AnsiballZ_command.py'
Dec 02 23:34:39 compute-0 sudo[109790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:39 compute-0 python3.9[109792]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:39 compute-0 sudo[109790]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:40 compute-0 sudo[109943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywibclkxqcjjbxxztlxglnpikgfvhium ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718479.7600386-466-13909757590341/AnsiballZ_command.py'
Dec 02 23:34:40 compute-0 sudo[109943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:40 compute-0 python3.9[109945]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:40 compute-0 sudo[109943]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:40 compute-0 sudo[110096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atkqqxxqlzgtxvezwijaonesckvisibg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718480.4088912-466-263608691891225/AnsiballZ_command.py'
Dec 02 23:34:40 compute-0 sudo[110096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:40 compute-0 python3.9[110098]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:34:40 compute-0 sudo[110096]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:42 compute-0 sudo[110249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfsrhkkfkzhougjghrtzehvxrmcwgiir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718482.048177-574-149590289945574/AnsiballZ_getent.py'
Dec 02 23:34:42 compute-0 sudo[110249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:42 compute-0 python3.9[110251]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 02 23:34:42 compute-0 sudo[110249]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:43 compute-0 sudo[110402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oamkpwoiqtsniehiswspmwznpozjvuzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718483.098021-590-17550489999017/AnsiballZ_group.py'
Dec 02 23:34:43 compute-0 sudo[110402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:43 compute-0 python3.9[110404]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:34:43 compute-0 groupadd[110405]: group added to /etc/group: name=libvirt, GID=42473
Dec 02 23:34:43 compute-0 groupadd[110405]: group added to /etc/gshadow: name=libvirt
Dec 02 23:34:43 compute-0 groupadd[110405]: new group: name=libvirt, GID=42473
Dec 02 23:34:43 compute-0 sudo[110402]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:44 compute-0 sudo[110560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oequzwevelmlogjovnubyskxbzydxemm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718484.1492915-606-35498296717068/AnsiballZ_user.py'
Dec 02 23:34:44 compute-0 sudo[110560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:44 compute-0 python3.9[110562]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:34:44 compute-0 useradd[110564]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:34:44 compute-0 sudo[110560]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:45 compute-0 sudo[110720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlecwnkjvxtgjlkoqjgglspjlfpnzwau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718485.4974573-628-149851323306176/AnsiballZ_setup.py'
Dec 02 23:34:45 compute-0 sudo[110720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:46 compute-0 python3.9[110722]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:34:46 compute-0 sudo[110720]: pam_unix(sudo:session): session closed for user root
Dec 02 23:34:46 compute-0 sudo[110804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gasygnqscublclxuzzcbojdhueshoraz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718485.4974573-628-149851323306176/AnsiballZ_dnf.py'
Dec 02 23:34:46 compute-0 sudo[110804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:34:47 compute-0 python3.9[110806]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:34:57 compute-0 podman[110943]: 2025-12-02 23:34:57.176297685 +0000 UTC m=+0.120897496 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:35:00 compute-0 podman[111018]: 2025-12-02 23:35:00.10644196 +0000 UTC m=+0.060185718 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:35:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:35:00.642 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:35:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:35:00.643 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:35:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:35:00.643 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:35:15 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:35:15 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:35:26 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:35:26 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:35:28 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 02 23:35:28 compute-0 podman[111060]: 2025-12-02 23:35:28.165660102 +0000 UTC m=+0.114784438 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 02 23:35:31 compute-0 podman[111088]: 2025-12-02 23:35:31.107979194 +0000 UTC m=+0.066115543 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 02 23:35:42 compute-0 sshd-session[111086]: Received disconnect from 45.78.218.154 port 56458:11: Bye Bye [preauth]
Dec 02 23:35:42 compute-0 sshd-session[111086]: Disconnected from 45.78.218.154 port 56458 [preauth]
Dec 02 23:35:59 compute-0 podman[122448]: 2025-12-02 23:35:59.219643581 +0000 UTC m=+0.168437141 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:36:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:36:00.643 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:36:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:36:00.644 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:36:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:36:00.644 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:36:02 compute-0 podman[124141]: 2025-12-02 23:36:02.139824354 +0000 UTC m=+0.081607212 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 02 23:36:04 compute-0 sshd-session[124839]: Received disconnect from 49.247.36.49 port 44359:11: Bye Bye [preauth]
Dec 02 23:36:04 compute-0 sshd-session[124839]: Disconnected from authenticating user root 49.247.36.49 port 44359 [preauth]
Dec 02 23:36:23 compute-0 kernel: SELinux:  Converting 2760 SID table entries...
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 02 23:36:23 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 02 23:36:24 compute-0 groupadd[127974]: group added to /etc/group: name=dnsmasq, GID=992
Dec 02 23:36:24 compute-0 groupadd[127974]: group added to /etc/gshadow: name=dnsmasq
Dec 02 23:36:24 compute-0 groupadd[127974]: new group: name=dnsmasq, GID=992
Dec 02 23:36:24 compute-0 useradd[127981]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 02 23:36:24 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:36:24 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 02 23:36:24 compute-0 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Dec 02 23:36:25 compute-0 groupadd[127994]: group added to /etc/group: name=clevis, GID=991
Dec 02 23:36:25 compute-0 groupadd[127994]: group added to /etc/gshadow: name=clevis
Dec 02 23:36:25 compute-0 groupadd[127994]: new group: name=clevis, GID=991
Dec 02 23:36:25 compute-0 useradd[128001]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 02 23:36:25 compute-0 usermod[128011]: add 'clevis' to group 'tss'
Dec 02 23:36:25 compute-0 usermod[128011]: add 'clevis' to shadow group 'tss'
Dec 02 23:36:27 compute-0 polkitd[43720]: Reloading rules
Dec 02 23:36:27 compute-0 polkitd[43720]: Collecting garbage unconditionally...
Dec 02 23:36:27 compute-0 polkitd[43720]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 23:36:27 compute-0 polkitd[43720]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 23:36:27 compute-0 polkitd[43720]: Finished loading, compiling and executing 3 rules
Dec 02 23:36:27 compute-0 polkitd[43720]: Reloading rules
Dec 02 23:36:27 compute-0 polkitd[43720]: Collecting garbage unconditionally...
Dec 02 23:36:27 compute-0 polkitd[43720]: Loading rules from directory /etc/polkit-1/rules.d
Dec 02 23:36:27 compute-0 polkitd[43720]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 02 23:36:27 compute-0 polkitd[43720]: Finished loading, compiling and executing 3 rules
Dec 02 23:36:28 compute-0 groupadd[128198]: group added to /etc/group: name=ceph, GID=167
Dec 02 23:36:28 compute-0 groupadd[128198]: group added to /etc/gshadow: name=ceph
Dec 02 23:36:28 compute-0 groupadd[128198]: new group: name=ceph, GID=167
Dec 02 23:36:28 compute-0 useradd[128204]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 02 23:36:30 compute-0 podman[128211]: 2025-12-02 23:36:30.161606584 +0000 UTC m=+0.113369870 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 02 23:36:31 compute-0 sshd[1004]: Received signal 15; terminating.
Dec 02 23:36:31 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 02 23:36:31 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 02 23:36:31 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 02 23:36:31 compute-0 systemd[1]: sshd.service: Consumed 2.409s CPU time, read 32.0K from disk, written 20.0K to disk.
Dec 02 23:36:31 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 02 23:36:31 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 02 23:36:31 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 23:36:31 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 23:36:31 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 02 23:36:31 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 02 23:36:31 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 02 23:36:31 compute-0 sshd[128750]: Server listening on 0.0.0.0 port 22.
Dec 02 23:36:31 compute-0 sshd[128750]: Server listening on :: port 22.
Dec 02 23:36:31 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 02 23:36:32 compute-0 podman[128821]: 2025-12-02 23:36:32.249897695 +0000 UTC m=+0.065782544 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:36:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:36:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:36:33 compute-0 systemd[1]: Reloading.
Dec 02 23:36:33 compute-0 systemd-rc-local-generator[129029]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:33 compute-0 systemd-sysv-generator[129033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:36:37 compute-0 sudo[110804]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:39 compute-0 sudo[136561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhkmgemxockcalxvtjohjmzehcdpnam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718599.2954323-652-7215461979564/AnsiballZ_systemd.py'
Dec 02 23:36:39 compute-0 sudo[136561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:40 compute-0 python3.9[136594]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:40 compute-0 systemd[1]: Reloading.
Dec 02 23:36:40 compute-0 systemd-sysv-generator[137072]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:40 compute-0 systemd-rc-local-generator[137068]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:40 compute-0 sudo[136561]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:41 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:36:41 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:36:41 compute-0 sudo[137758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmhvlglboozcdquggflnvpuehhwyggim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718600.7041605-652-192272132244978/AnsiballZ_systemd.py'
Dec 02 23:36:41 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.771s CPU time.
Dec 02 23:36:41 compute-0 systemd[1]: run-r86f191bc2da54359be9010b21b9c3b79.service: Deactivated successfully.
Dec 02 23:36:41 compute-0 sudo[137758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:41 compute-0 python3.9[137761]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:41 compute-0 systemd[1]: Reloading.
Dec 02 23:36:41 compute-0 systemd-rc-local-generator[137792]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:41 compute-0 systemd-sysv-generator[137796]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:41 compute-0 sudo[137758]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:42 compute-0 sudo[137950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbqvkvwevkpvxwraqrzogwkgtnzfzmbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718601.7545364-652-121381339058849/AnsiballZ_systemd.py'
Dec 02 23:36:42 compute-0 sudo[137950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:42 compute-0 python3.9[137952]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:42 compute-0 systemd[1]: Reloading.
Dec 02 23:36:42 compute-0 systemd-rc-local-generator[137985]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:42 compute-0 systemd-sysv-generator[137988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:42 compute-0 sudo[137950]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:43 compute-0 sudo[138141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoovngdaxfogunvezdzzxyvkpgqkvrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718602.8630052-652-250647088874078/AnsiballZ_systemd.py'
Dec 02 23:36:43 compute-0 sudo[138141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:43 compute-0 python3.9[138143]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:43 compute-0 systemd[1]: Reloading.
Dec 02 23:36:43 compute-0 systemd-rc-local-generator[138174]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:43 compute-0 systemd-sysv-generator[138178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:43 compute-0 sudo[138141]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:44 compute-0 sudo[138331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uizfbasubxgaipukesqpjjdgxmaajmnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718604.5568693-710-93891929959115/AnsiballZ_systemd.py'
Dec 02 23:36:44 compute-0 sudo[138331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:45 compute-0 python3.9[138333]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:45 compute-0 systemd[1]: Reloading.
Dec 02 23:36:45 compute-0 systemd-rc-local-generator[138359]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:45 compute-0 systemd-sysv-generator[138362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:45 compute-0 sudo[138331]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:46 compute-0 sudo[138522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmtilgpbmgtoyxolqtcidpevzxquwnld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718605.870454-710-107171552191749/AnsiballZ_systemd.py'
Dec 02 23:36:46 compute-0 sudo[138522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:46 compute-0 python3.9[138524]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:46 compute-0 systemd[1]: Reloading.
Dec 02 23:36:46 compute-0 systemd-rc-local-generator[138557]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:46 compute-0 systemd-sysv-generator[138561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:46 compute-0 sudo[138522]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:47 compute-0 sudo[138712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnvwwzjgobmhzpdznzdprptkcqupiohz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718607.0439603-710-198058292489503/AnsiballZ_systemd.py'
Dec 02 23:36:47 compute-0 sudo[138712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:47 compute-0 python3.9[138714]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:47 compute-0 systemd[1]: Reloading.
Dec 02 23:36:47 compute-0 systemd-rc-local-generator[138747]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:47 compute-0 systemd-sysv-generator[138750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:48 compute-0 sudo[138712]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:48 compute-0 sudo[138904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmnhuvofkrqwjowkjgvaqoqfzuqkanfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718608.1808763-710-148147482597752/AnsiballZ_systemd.py'
Dec 02 23:36:48 compute-0 sudo[138904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:48 compute-0 python3.9[138906]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:48 compute-0 sudo[138904]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:49 compute-0 sudo[139059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etamgruyogfahvjqnspnyrfacqzktanq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718609.0411444-710-39622516668713/AnsiballZ_systemd.py'
Dec 02 23:36:49 compute-0 sudo[139059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:49 compute-0 python3.9[139061]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:49 compute-0 systemd[1]: Reloading.
Dec 02 23:36:49 compute-0 systemd-rc-local-generator[139088]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:49 compute-0 systemd-sysv-generator[139093]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:50 compute-0 sudo[139059]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:50 compute-0 sudo[139249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btahpqvbwzvnvzcewqalapotmtpijkcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718610.629002-782-126231638715097/AnsiballZ_systemd.py'
Dec 02 23:36:50 compute-0 sudo[139249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:51 compute-0 python3.9[139251]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 02 23:36:52 compute-0 systemd[1]: Reloading.
Dec 02 23:36:52 compute-0 systemd-rc-local-generator[139278]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:36:52 compute-0 systemd-sysv-generator[139282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:36:52 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 02 23:36:52 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 02 23:36:52 compute-0 sudo[139249]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:53 compute-0 sudo[139443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiuadrgxgteiyohboyhexlvelxbeyqwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718612.880027-798-177212466341577/AnsiballZ_systemd.py'
Dec 02 23:36:53 compute-0 sudo[139443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:53 compute-0 python3.9[139445]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:53 compute-0 sudo[139443]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:53 compute-0 sudo[139598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdkqxpqpnthkquliyfqafxvhgbesyxns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718613.6785069-798-246305740120203/AnsiballZ_systemd.py'
Dec 02 23:36:53 compute-0 sudo[139598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:54 compute-0 python3.9[139600]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:54 compute-0 sudo[139598]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:54 compute-0 sudo[139753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isxjgvpbzkhuvukqiabbfafwrwzdfann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718614.4928107-798-244635714835999/AnsiballZ_systemd.py'
Dec 02 23:36:54 compute-0 sudo[139753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:55 compute-0 python3.9[139755]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:55 compute-0 sudo[139753]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:55 compute-0 sudo[139908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsktlvjvoanewnugugiomnjncfcovzje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718615.4564183-798-14585620510016/AnsiballZ_systemd.py'
Dec 02 23:36:55 compute-0 sudo[139908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:56 compute-0 python3.9[139910]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:56 compute-0 sudo[139908]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:56 compute-0 sudo[140063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcvroqazqesygyrhgvkguowsagrvqxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718616.3897555-798-91288253666860/AnsiballZ_systemd.py'
Dec 02 23:36:56 compute-0 sudo[140063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:56 compute-0 python3.9[140065]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:57 compute-0 sudo[140063]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:57 compute-0 sudo[140218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wffkoqqvvfaqxmrurbhkiiwwxurwsypm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718617.1282318-798-18600455762522/AnsiballZ_systemd.py'
Dec 02 23:36:57 compute-0 sudo[140218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:57 compute-0 python3.9[140220]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:58 compute-0 sudo[140218]: pam_unix(sudo:session): session closed for user root
Dec 02 23:36:59 compute-0 sudo[140373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yalqgiwalpuuvvoqfbaybuewxvfxqecp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718619.0235522-798-247915527912145/AnsiballZ_systemd.py'
Dec 02 23:36:59 compute-0 sudo[140373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:36:59 compute-0 python3.9[140375]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:36:59 compute-0 sudo[140373]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:00 compute-0 sudo[140528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edoehtxwcaukrscqufvaapoxucjbrbkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718619.8794706-798-110963257522385/AnsiballZ_systemd.py'
Dec 02 23:37:00 compute-0 sudo[140528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:00 compute-0 podman[140530]: 2025-12-02 23:37:00.32923562 +0000 UTC m=+0.085610004 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 02 23:37:00 compute-0 python3.9[140531]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:00 compute-0 sudo[140528]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:37:00.644 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:37:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:37:00.645 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:37:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:37:00.645 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:37:01 compute-0 sudo[140711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgmtyzysretjylikdhwdwrxblxvhuhpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718620.783574-798-191830258002433/AnsiballZ_systemd.py'
Dec 02 23:37:01 compute-0 sudo[140711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:01 compute-0 python3.9[140713]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:02 compute-0 podman[140716]: 2025-12-02 23:37:02.471534875 +0000 UTC m=+0.055357648 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 02 23:37:02 compute-0 sudo[140711]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:02 compute-0 sudo[140887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zubzunmozbsjpvskrhuoukwiplpsfvgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718622.651027-798-195034400754571/AnsiballZ_systemd.py'
Dec 02 23:37:02 compute-0 sudo[140887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:03 compute-0 python3.9[140889]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:04 compute-0 sudo[140887]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:04 compute-0 sudo[141042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dewiqppmulwjurexqmarrhonupgqulaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718624.5241554-798-89239814231755/AnsiballZ_systemd.py'
Dec 02 23:37:04 compute-0 sudo[141042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:05 compute-0 python3.9[141044]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:05 compute-0 sudo[141042]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:05 compute-0 sudo[141197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjswdtvjfrafphwxjxoddtuikxjtsimm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718625.4092038-798-239852500698351/AnsiballZ_systemd.py'
Dec 02 23:37:05 compute-0 sudo[141197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:06 compute-0 python3.9[141199]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:07 compute-0 sudo[141197]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:07 compute-0 sudo[141352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qagchzpqndxvwrpivsqlwekiwcxmleyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718627.4178617-798-164214438560296/AnsiballZ_systemd.py'
Dec 02 23:37:07 compute-0 sudo[141352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:07 compute-0 python3.9[141354]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:08 compute-0 sudo[141352]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:08 compute-0 sudo[141507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turqzvtzkoeotktszmygxkfyvfteohah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718628.2528694-798-108744172323702/AnsiballZ_systemd.py'
Dec 02 23:37:08 compute-0 sudo[141507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:08 compute-0 python3.9[141509]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 02 23:37:08 compute-0 sudo[141507]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:09 compute-0 sudo[141662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjfcylysxyrcrqqvbvldoejyudfcbmin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718629.4832563-1002-112135703987667/AnsiballZ_file.py'
Dec 02 23:37:09 compute-0 sudo[141662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:09 compute-0 python3.9[141664]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:09 compute-0 sudo[141662]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:10 compute-0 sudo[141814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydkglbqayxvvfcnxoswpqudfwvhexhbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718630.1156008-1002-63930164640691/AnsiballZ_file.py'
Dec 02 23:37:10 compute-0 sudo[141814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:10 compute-0 python3.9[141816]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:10 compute-0 sudo[141814]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:11 compute-0 sudo[141966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjofrajzpqudldsnbnytobmbfctrgxfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718630.742088-1002-27284202396366/AnsiballZ_file.py'
Dec 02 23:37:11 compute-0 sudo[141966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:11 compute-0 python3.9[141968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:11 compute-0 sudo[141966]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:11 compute-0 sudo[142118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tznvyhxczljmgonampxlckhrmvohphnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718631.3300445-1002-248785459744183/AnsiballZ_file.py'
Dec 02 23:37:11 compute-0 sudo[142118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:11 compute-0 python3.9[142120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:11 compute-0 sudo[142118]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:12 compute-0 sudo[142270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppzekeccbrynhccrhtjrltedixqvwwuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718632.0920727-1002-209913306169393/AnsiballZ_file.py'
Dec 02 23:37:12 compute-0 sudo[142270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:12 compute-0 python3.9[142272]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:12 compute-0 sudo[142270]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:13 compute-0 sudo[142422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbuylqxtpjjisummlfgqgzpwpnxatioc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718632.7921293-1002-252103081301735/AnsiballZ_file.py'
Dec 02 23:37:13 compute-0 sudo[142422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:13 compute-0 python3.9[142424]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:37:13 compute-0 sudo[142422]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:14 compute-0 sudo[142574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifjmlxpcvqpguhukvwmdyvpxzppbnwzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718633.6178675-1088-142118804959192/AnsiballZ_stat.py'
Dec 02 23:37:14 compute-0 sudo[142574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:14 compute-0 python3.9[142576]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:14 compute-0 sudo[142574]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:14 compute-0 sudo[142699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmxyynzsqkshqnwcdjaueuymwemyqdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718633.6178675-1088-142118804959192/AnsiballZ_copy.py'
Dec 02 23:37:14 compute-0 sudo[142699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:15 compute-0 python3.9[142701]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718633.6178675-1088-142118804959192/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:15 compute-0 sudo[142699]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:15 compute-0 sudo[142851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rryqokzhfyeqotgzcllantuigspcqfjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718635.2782834-1088-198412739469687/AnsiballZ_stat.py'
Dec 02 23:37:15 compute-0 sudo[142851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:15 compute-0 python3.9[142853]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:15 compute-0 sudo[142851]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:16 compute-0 sudo[142976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahvkkgpfcprcwposqfddbgirgdmmhulz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718635.2782834-1088-198412739469687/AnsiballZ_copy.py'
Dec 02 23:37:16 compute-0 sudo[142976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:16 compute-0 python3.9[142978]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718635.2782834-1088-198412739469687/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:16 compute-0 sudo[142976]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:16 compute-0 sudo[143128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjpaawxepaupmvihrllnwumtmfduzucz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718636.5569584-1088-95973952398073/AnsiballZ_stat.py'
Dec 02 23:37:16 compute-0 sudo[143128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:17 compute-0 python3.9[143130]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:17 compute-0 sudo[143128]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:17 compute-0 sudo[143253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-himeaaugaludfczzhjvrhterncfxkhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718636.5569584-1088-95973952398073/AnsiballZ_copy.py'
Dec 02 23:37:17 compute-0 sudo[143253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:17 compute-0 python3.9[143255]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718636.5569584-1088-95973952398073/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:17 compute-0 sudo[143253]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:18 compute-0 sudo[143405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzmojtmtokoadeahstsrjwteatvifiya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718638.0623112-1088-26192013978236/AnsiballZ_stat.py'
Dec 02 23:37:18 compute-0 sudo[143405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:18 compute-0 python3.9[143407]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:18 compute-0 sudo[143405]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:19 compute-0 sudo[143530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbksfchtnwdkydaolfpheiabpijghvtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718638.0623112-1088-26192013978236/AnsiballZ_copy.py'
Dec 02 23:37:19 compute-0 sudo[143530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:19 compute-0 python3.9[143532]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718638.0623112-1088-26192013978236/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:19 compute-0 sudo[143530]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:19 compute-0 sudo[143682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhqibwpeiwlherplbiozyquzgrheqgim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718639.378667-1088-83656100412282/AnsiballZ_stat.py'
Dec 02 23:37:19 compute-0 sudo[143682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:19 compute-0 python3.9[143684]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:19 compute-0 sudo[143682]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:20 compute-0 sudo[143807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwnoirfyukjrgpayycmlwfifrrvcayqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718639.378667-1088-83656100412282/AnsiballZ_copy.py'
Dec 02 23:37:20 compute-0 sudo[143807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:20 compute-0 python3.9[143809]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718639.378667-1088-83656100412282/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:20 compute-0 sudo[143807]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:21 compute-0 sudo[143959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcfubxqfzlvvrproeqlufbjzzenooytj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718640.7313304-1088-86321691277791/AnsiballZ_stat.py'
Dec 02 23:37:21 compute-0 sudo[143959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:21 compute-0 python3.9[143961]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:21 compute-0 sudo[143959]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:21 compute-0 sudo[144084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dccqpelnxyhtnmdlkoirslrperugmjxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718640.7313304-1088-86321691277791/AnsiballZ_copy.py'
Dec 02 23:37:21 compute-0 sudo[144084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:21 compute-0 python3.9[144086]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718640.7313304-1088-86321691277791/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:21 compute-0 sudo[144084]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:22 compute-0 sudo[144236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwcoetzirrwzmxgfrdzftdgwzbrzkjkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718642.1074953-1088-58222968685232/AnsiballZ_stat.py'
Dec 02 23:37:22 compute-0 sudo[144236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:22 compute-0 python3.9[144238]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:22 compute-0 sudo[144236]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:23 compute-0 sudo[144359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loaivagwoefrmsmorzcocthyxlbpbwqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718642.1074953-1088-58222968685232/AnsiballZ_copy.py'
Dec 02 23:37:23 compute-0 sudo[144359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:23 compute-0 python3.9[144361]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718642.1074953-1088-58222968685232/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:23 compute-0 sudo[144359]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:23 compute-0 sudo[144511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvnnkcsxydtrdnsisrabnkepwdxuzumv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718643.5335145-1088-64997483861542/AnsiballZ_stat.py'
Dec 02 23:37:23 compute-0 sudo[144511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:24 compute-0 python3.9[144513]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:24 compute-0 sudo[144511]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:24 compute-0 sudo[144636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxisgicspetewdofpyzklubzdjvppmjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718643.5335145-1088-64997483861542/AnsiballZ_copy.py'
Dec 02 23:37:24 compute-0 sudo[144636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:24 compute-0 python3.9[144638]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764718643.5335145-1088-64997483861542/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:24 compute-0 sudo[144636]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:25 compute-0 sudo[144788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oetwicachhsheuyhpxxvceqbwcsbpvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718645.4999182-1314-180165130876855/AnsiballZ_command.py'
Dec 02 23:37:25 compute-0 sudo[144788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:25 compute-0 python3.9[144790]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 02 23:37:26 compute-0 sudo[144788]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:26 compute-0 sudo[144943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djovnhvsdxgswvayjdnztnrafmmluwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718646.3503027-1332-122796247418678/AnsiballZ_file.py'
Dec 02 23:37:26 compute-0 sudo[144943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:26 compute-0 python3.9[144945]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:26 compute-0 sudo[144943]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:27 compute-0 sshd-session[144816]: Invalid user max from 49.247.36.49 port 36836
Dec 02 23:37:27 compute-0 sudo[145095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpzsfwveastzcnxcoejevzeeoiqwsicb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718647.0515063-1332-59559607813660/AnsiballZ_file.py'
Dec 02 23:37:27 compute-0 sudo[145095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:27 compute-0 sshd-session[144816]: Received disconnect from 49.247.36.49 port 36836:11: Bye Bye [preauth]
Dec 02 23:37:27 compute-0 sshd-session[144816]: Disconnected from invalid user max 49.247.36.49 port 36836 [preauth]
Dec 02 23:37:27 compute-0 python3.9[145097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:27 compute-0 sudo[145095]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:28 compute-0 sudo[145247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcevbfazmuuaijmngyagpslymimazpda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718647.7508838-1332-139554467527461/AnsiballZ_file.py'
Dec 02 23:37:28 compute-0 sudo[145247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:28 compute-0 python3.9[145249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:28 compute-0 sudo[145247]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:28 compute-0 sudo[145399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prgbgcpjvuwaezolptgutgrfdefcolsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718648.506423-1332-82521039609622/AnsiballZ_file.py'
Dec 02 23:37:28 compute-0 sudo[145399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:29 compute-0 python3.9[145401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:29 compute-0 sudo[145399]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:29 compute-0 sudo[145551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdahrkruormpsfrolcocakwycjxlhagd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718649.1833546-1332-7093091809145/AnsiballZ_file.py'
Dec 02 23:37:29 compute-0 sudo[145551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:29 compute-0 python3.9[145553]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:29 compute-0 sudo[145551]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:30 compute-0 sudo[145703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omcsrsfyyrpyiihlbzpnokwmrawkgoqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718649.9177692-1332-157490533783163/AnsiballZ_file.py'
Dec 02 23:37:30 compute-0 sudo[145703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:30 compute-0 python3.9[145705]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:30 compute-0 sudo[145703]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:30 compute-0 sudo[145865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfnmzeidzgycjgcbopsuwoaapnkoxsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718650.5926263-1332-33644059016748/AnsiballZ_file.py'
Dec 02 23:37:30 compute-0 sudo[145865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:30 compute-0 podman[145829]: 2025-12-02 23:37:30.928306836 +0000 UTC m=+0.089702913 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:37:31 compute-0 python3.9[145874]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:31 compute-0 sudo[145865]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:31 compute-0 sudo[146033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evocjtewisgpldaroafnvrqywbcjirvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718651.201422-1332-94527592631731/AnsiballZ_file.py'
Dec 02 23:37:31 compute-0 sudo[146033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:31 compute-0 python3.9[146035]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:31 compute-0 sudo[146033]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:32 compute-0 sudo[146185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txmxxjrmalmvjdbnmrhqlekibokchgmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718651.8394122-1332-147519480076439/AnsiballZ_file.py'
Dec 02 23:37:32 compute-0 sudo[146185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:32 compute-0 python3.9[146187]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:32 compute-0 sudo[146185]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:32 compute-0 sudo[146346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xabagnpjvvhyzdbqpyjziflxeqkmpmsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718652.5200763-1332-83965643110306/AnsiballZ_file.py'
Dec 02 23:37:32 compute-0 sudo[146346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:32 compute-0 podman[146311]: 2025-12-02 23:37:32.916394499 +0000 UTC m=+0.078744507 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 02 23:37:33 compute-0 python3.9[146358]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:33 compute-0 sudo[146346]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:33 compute-0 sudo[146508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlbbwxsfaxzhxbepnrdxqxkswstjcxfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718653.2565444-1332-226044091716736/AnsiballZ_file.py'
Dec 02 23:37:33 compute-0 sudo[146508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:33 compute-0 python3.9[146510]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:33 compute-0 sudo[146508]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:34 compute-0 sudo[146660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpvkrenqervawvsivfihxcsiwvssbjhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718653.9629457-1332-196435267059032/AnsiballZ_file.py'
Dec 02 23:37:34 compute-0 sudo[146660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:34 compute-0 python3.9[146662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:34 compute-0 sudo[146660]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:35 compute-0 sudo[146812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyngfchnzgyvwzxjydtilovrwmxzkbcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718654.7168393-1332-213373761961801/AnsiballZ_file.py'
Dec 02 23:37:35 compute-0 sudo[146812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:35 compute-0 python3.9[146814]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:35 compute-0 sudo[146812]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:35 compute-0 sudo[146964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfcsoxbpcercrqfuwacdgowscwswtuyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718655.4021895-1332-150680482404211/AnsiballZ_file.py'
Dec 02 23:37:35 compute-0 sudo[146964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:35 compute-0 python3.9[146966]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:35 compute-0 sudo[146964]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:37 compute-0 sudo[147116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvimkfikvlaztziyiewhheaovhhsoagx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718656.8105001-1530-2988958746941/AnsiballZ_stat.py'
Dec 02 23:37:37 compute-0 sudo[147116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:37 compute-0 python3.9[147118]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:37 compute-0 sudo[147116]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:37 compute-0 sudo[147239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlxnvxqipywvujqmomruuxjjktqkfipk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718656.8105001-1530-2988958746941/AnsiballZ_copy.py'
Dec 02 23:37:37 compute-0 sudo[147239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:37 compute-0 python3.9[147241]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718656.8105001-1530-2988958746941/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:37 compute-0 sudo[147239]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:38 compute-0 sudo[147391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcrbortlcxypchhsgtoomicprmenxixs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718658.1565826-1530-266236002307793/AnsiballZ_stat.py'
Dec 02 23:37:38 compute-0 sudo[147391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:38 compute-0 python3.9[147393]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:38 compute-0 sudo[147391]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:39 compute-0 sudo[147514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqfggvjyqdvbdokhsjoaoqohmoyhjqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718658.1565826-1530-266236002307793/AnsiballZ_copy.py'
Dec 02 23:37:39 compute-0 sudo[147514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:39 compute-0 python3.9[147516]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718658.1565826-1530-266236002307793/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:39 compute-0 sudo[147514]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:39 compute-0 sudo[147666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqukbkceummpfmlhpqxngorqfkuwpacw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718659.558809-1530-92789348207554/AnsiballZ_stat.py'
Dec 02 23:37:39 compute-0 sudo[147666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:40 compute-0 python3.9[147668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:40 compute-0 sudo[147666]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:40 compute-0 sudo[147789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqzomqcxbiphkjkaosphdurllqzlnjgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718659.558809-1530-92789348207554/AnsiballZ_copy.py'
Dec 02 23:37:40 compute-0 sudo[147789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:40 compute-0 python3.9[147791]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718659.558809-1530-92789348207554/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:40 compute-0 sudo[147789]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:41 compute-0 sudo[147941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwlvyfikbmtuupmjxxdsbzvpgicgkguo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718661.0043666-1530-90969727759800/AnsiballZ_stat.py'
Dec 02 23:37:41 compute-0 sudo[147941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:41 compute-0 python3.9[147943]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:41 compute-0 sudo[147941]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:41 compute-0 sudo[148064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aclqrtegdqisdhvqgvdjayifzmmrihea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718661.0043666-1530-90969727759800/AnsiballZ_copy.py'
Dec 02 23:37:41 compute-0 sudo[148064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:42 compute-0 python3.9[148066]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718661.0043666-1530-90969727759800/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:42 compute-0 sudo[148064]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:42 compute-0 sudo[148216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebsgzstquggzufadlzduhmykpwdbhuwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718662.3097496-1530-245316694824938/AnsiballZ_stat.py'
Dec 02 23:37:42 compute-0 sudo[148216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:42 compute-0 python3.9[148218]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:42 compute-0 sudo[148216]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:43 compute-0 sudo[148339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngcklmqfkzqdxiavlygenxlhxydlbvnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718662.3097496-1530-245316694824938/AnsiballZ_copy.py'
Dec 02 23:37:43 compute-0 sudo[148339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:43 compute-0 python3.9[148341]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718662.3097496-1530-245316694824938/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:43 compute-0 sudo[148339]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:43 compute-0 sudo[148491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azazxeieyrioubabagtdrpcggopdwnyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718663.6524267-1530-87071410563291/AnsiballZ_stat.py'
Dec 02 23:37:43 compute-0 sudo[148491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:44 compute-0 python3.9[148493]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:44 compute-0 sudo[148491]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:44 compute-0 sudo[148614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obkvagvedlvrbxkfxglmbqqxphobcqsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718663.6524267-1530-87071410563291/AnsiballZ_copy.py'
Dec 02 23:37:44 compute-0 sudo[148614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:44 compute-0 python3.9[148616]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718663.6524267-1530-87071410563291/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:44 compute-0 sudo[148614]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:45 compute-0 sudo[148766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngmtlwsmmwauugwrngonjwsjyenaxcgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718664.8709834-1530-211292403639344/AnsiballZ_stat.py'
Dec 02 23:37:45 compute-0 sudo[148766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:45 compute-0 python3.9[148768]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:45 compute-0 sudo[148766]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:45 compute-0 sudo[148889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuiohodtazsjeupxtkeorswuilzeucvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718664.8709834-1530-211292403639344/AnsiballZ_copy.py'
Dec 02 23:37:45 compute-0 sudo[148889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:46 compute-0 python3.9[148891]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718664.8709834-1530-211292403639344/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:46 compute-0 sudo[148889]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:46 compute-0 sudo[149041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfygzuztncdcjrghstrowhjdvgwsfhnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718666.231821-1530-71706010793535/AnsiballZ_stat.py'
Dec 02 23:37:46 compute-0 sudo[149041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:46 compute-0 python3.9[149043]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:46 compute-0 sudo[149041]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:47 compute-0 sudo[149164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jegmobccxjmemxabqftlqwqddiksiuyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718666.231821-1530-71706010793535/AnsiballZ_copy.py'
Dec 02 23:37:47 compute-0 sudo[149164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:47 compute-0 python3.9[149166]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718666.231821-1530-71706010793535/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:47 compute-0 sudo[149164]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:47 compute-0 sudo[149316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbjnmshqfbixlomvjejvugacvzqvrurg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718667.6214538-1530-62130736954105/AnsiballZ_stat.py'
Dec 02 23:37:47 compute-0 sudo[149316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:48 compute-0 python3.9[149318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:48 compute-0 sudo[149316]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:48 compute-0 sudo[149441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnvskgvtkgsgxccmmsbazsrpcwdfmbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718667.6214538-1530-62130736954105/AnsiballZ_copy.py'
Dec 02 23:37:48 compute-0 sudo[149441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:48 compute-0 python3.9[149443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718667.6214538-1530-62130736954105/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:48 compute-0 sudo[149441]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:49 compute-0 sudo[149594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqydnhqgkpiygacfpnbtdtnrvsejprbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718669.185011-1530-159118472881951/AnsiballZ_stat.py'
Dec 02 23:37:49 compute-0 sudo[149594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:49 compute-0 python3.9[149596]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:49 compute-0 sudo[149594]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:50 compute-0 sudo[149717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muxnabowtaqsatmrznuoyzkejlpkfpsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718669.185011-1530-159118472881951/AnsiballZ_copy.py'
Dec 02 23:37:50 compute-0 sudo[149717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:50 compute-0 python3.9[149719]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718669.185011-1530-159118472881951/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:50 compute-0 sudo[149717]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:50 compute-0 sshd-session[149319]: Received disconnect from 45.78.218.154 port 60954:11: Bye Bye [preauth]
Dec 02 23:37:50 compute-0 sshd-session[149319]: Disconnected from authenticating user root 45.78.218.154 port 60954 [preauth]
Dec 02 23:37:50 compute-0 sudo[149869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fszajjdtsnlcorwyrtgsjcqnwpbrcdom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718670.4682152-1530-226040492730231/AnsiballZ_stat.py'
Dec 02 23:37:50 compute-0 sudo[149869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:50 compute-0 python3.9[149871]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:50 compute-0 sudo[149869]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:51 compute-0 sudo[149992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyxkzccezzonfrzxzyzzmklzwpltqgci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718670.4682152-1530-226040492730231/AnsiballZ_copy.py'
Dec 02 23:37:51 compute-0 sudo[149992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:51 compute-0 python3.9[149994]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718670.4682152-1530-226040492730231/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:51 compute-0 sudo[149992]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:52 compute-0 sudo[150144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cazzsthzvtwmeduyupbuzvffxrrdymyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718671.7855167-1530-134535833964664/AnsiballZ_stat.py'
Dec 02 23:37:52 compute-0 sudo[150144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:52 compute-0 python3.9[150146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:52 compute-0 sudo[150144]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:52 compute-0 sudo[150267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkfjnncabeumbasbwxrovinzulunewow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718671.7855167-1530-134535833964664/AnsiballZ_copy.py'
Dec 02 23:37:52 compute-0 sudo[150267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:52 compute-0 python3.9[150269]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718671.7855167-1530-134535833964664/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:52 compute-0 sudo[150267]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:53 compute-0 sudo[150419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtcuzaacnincbresuxjtbfzhrsrtnpjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718672.9428897-1530-119373512921200/AnsiballZ_stat.py'
Dec 02 23:37:53 compute-0 sudo[150419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:53 compute-0 python3.9[150421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:53 compute-0 sudo[150419]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:53 compute-0 sudo[150542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwtonkqoimilmpjcexfwqxinysdxyuvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718672.9428897-1530-119373512921200/AnsiballZ_copy.py'
Dec 02 23:37:53 compute-0 sudo[150542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:54 compute-0 python3.9[150544]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718672.9428897-1530-119373512921200/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:54 compute-0 sudo[150542]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:54 compute-0 sudo[150694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwvieswcbockgqpwzdivjzldgkujrfjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718674.2404518-1530-159126518620848/AnsiballZ_stat.py'
Dec 02 23:37:54 compute-0 sudo[150694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:54 compute-0 python3.9[150696]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:37:54 compute-0 sudo[150694]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:55 compute-0 sudo[150817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-figgyiccjkpgjndzvqdsbleaboisbwyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718674.2404518-1530-159126518620848/AnsiballZ_copy.py'
Dec 02 23:37:55 compute-0 sudo[150817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:55 compute-0 python3.9[150819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718674.2404518-1530-159126518620848/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:55 compute-0 sudo[150817]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:56 compute-0 python3.9[150969]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:37:56 compute-0 sudo[151122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfrizivkvwoonjpvugnxhxhplfkwcadz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718676.3865297-1942-178935131459816/AnsiballZ_seboolean.py'
Dec 02 23:37:56 compute-0 sudo[151122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:57 compute-0 python3.9[151124]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 02 23:37:58 compute-0 sudo[151122]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:59 compute-0 sudo[151278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tecslpkmtcntzlvjkeitghrtqyirdyww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718678.6925085-1958-144378636848737/AnsiballZ_copy.py'
Dec 02 23:37:59 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 02 23:37:59 compute-0 sudo[151278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:37:59 compute-0 python3.9[151280]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:37:59 compute-0 sudo[151278]: pam_unix(sudo:session): session closed for user root
Dec 02 23:37:59 compute-0 sudo[151430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjwzzsysyxkjnatzivjojrmywnmjtaby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718679.55234-1958-62140607878789/AnsiballZ_copy.py'
Dec 02 23:37:59 compute-0 sudo[151430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:00 compute-0 python3.9[151432]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:00 compute-0 sudo[151430]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:38:00.647 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:38:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:38:00.648 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:38:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:38:00.649 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:38:00 compute-0 sudo[151583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eebyswqogitulmmoqynkozxriusbfvnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718680.391465-1958-9290902162081/AnsiballZ_copy.py'
Dec 02 23:38:00 compute-0 sudo[151583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:01 compute-0 python3.9[151585]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:01 compute-0 sudo[151583]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:01 compute-0 podman[151586]: 2025-12-02 23:38:01.201339327 +0000 UTC m=+0.135933347 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 23:38:01 compute-0 anacron[7485]: Job `cron.daily' started
Dec 02 23:38:01 compute-0 anacron[7485]: Job `cron.daily' terminated
Dec 02 23:38:01 compute-0 sudo[151763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrddbivyciefysfoaqowqokitrdddkrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718681.2485616-1958-217688713971433/AnsiballZ_copy.py'
Dec 02 23:38:01 compute-0 sudo[151763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:01 compute-0 python3.9[151765]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:01 compute-0 sudo[151763]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:02 compute-0 sudo[151915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pexwfkriqosyjjiajbicocuxtmiggjkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718681.980557-1958-65295276464686/AnsiballZ_copy.py'
Dec 02 23:38:02 compute-0 sudo[151915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:02 compute-0 python3.9[151917]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:02 compute-0 sudo[151915]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:03 compute-0 podman[152018]: 2025-12-02 23:38:03.15619731 +0000 UTC m=+0.104426001 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:38:03 compute-0 sudo[152086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqmhyvybmnqprgbqawhcyswlkubpwmdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718682.8038402-2030-177154567524692/AnsiballZ_copy.py'
Dec 02 23:38:03 compute-0 sudo[152086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:03 compute-0 python3.9[152088]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:03 compute-0 sudo[152086]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:04 compute-0 sudo[152238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzjjsqttgpidezjsrgudyuwxekedjcti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718683.6204526-2030-4196487467919/AnsiballZ_copy.py'
Dec 02 23:38:04 compute-0 sudo[152238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:04 compute-0 python3.9[152240]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:04 compute-0 sudo[152238]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:04 compute-0 sudo[152390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alsijktoboaqstxhunurkblxufgcxvqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718684.4649928-2030-218785276224173/AnsiballZ_copy.py'
Dec 02 23:38:04 compute-0 sudo[152390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:05 compute-0 python3.9[152392]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:05 compute-0 sudo[152390]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:05 compute-0 sudo[152542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgqqizqfcpljkoszpmikbnxuoedyplec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718685.2153492-2030-111816224097629/AnsiballZ_copy.py'
Dec 02 23:38:05 compute-0 sudo[152542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:05 compute-0 python3.9[152544]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:05 compute-0 sudo[152542]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:06 compute-0 sudo[152694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuszvpioieljgfpqoahtqqkruvkthzyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718685.8442929-2030-164358318249595/AnsiballZ_copy.py'
Dec 02 23:38:06 compute-0 sudo[152694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:06 compute-0 python3.9[152696]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:06 compute-0 sudo[152694]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:07 compute-0 sudo[152846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icnroeaiybiyiopauiefzyuxzchouzoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718686.8180213-2102-48283765989435/AnsiballZ_systemd.py'
Dec 02 23:38:07 compute-0 sudo[152846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:07 compute-0 python3.9[152848]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:07 compute-0 systemd[1]: Reloading.
Dec 02 23:38:07 compute-0 systemd-rc-local-generator[152876]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:07 compute-0 systemd-sysv-generator[152879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:07 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 02 23:38:07 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 02 23:38:07 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 02 23:38:07 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 02 23:38:07 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 02 23:38:07 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 02 23:38:07 compute-0 sudo[152846]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:08 compute-0 sudo[153039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgkicmjxolowutoqdmanxlewmrqtxmcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718687.976282-2102-87446923720933/AnsiballZ_systemd.py'
Dec 02 23:38:08 compute-0 sudo[153039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:08 compute-0 python3.9[153041]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:08 compute-0 systemd[1]: Reloading.
Dec 02 23:38:08 compute-0 systemd-rc-local-generator[153071]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:08 compute-0 systemd-sysv-generator[153074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:08 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 02 23:38:08 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 02 23:38:08 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 02 23:38:08 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 02 23:38:08 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 02 23:38:08 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 02 23:38:08 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 02 23:38:08 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 02 23:38:09 compute-0 sudo[153039]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:09 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 02 23:38:09 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 02 23:38:09 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 02 23:38:09 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 02 23:38:09 compute-0 sudo[153258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbkyceoyuyumxdhucyzsljvpwtdlyhjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718689.1782324-2102-19672069133102/AnsiballZ_systemd.py'
Dec 02 23:38:09 compute-0 sudo[153258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:09 compute-0 python3.9[153264]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:09 compute-0 systemd[1]: Reloading.
Dec 02 23:38:09 compute-0 systemd-rc-local-generator[153292]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:09 compute-0 systemd-sysv-generator[153295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:10 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 02 23:38:10 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 02 23:38:10 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 02 23:38:10 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 02 23:38:10 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 02 23:38:10 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 02 23:38:10 compute-0 sudo[153258]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:10 compute-0 setroubleshoot[153103]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a8bc38c9-3808-4fbd-a707-f0a5241174d0
Dec 02 23:38:10 compute-0 setroubleshoot[153103]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 02 23:38:10 compute-0 sudo[153476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evxbkmwmffbpbbrwgdvtropjiqsgijxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718690.3360834-2102-126064114617970/AnsiballZ_systemd.py'
Dec 02 23:38:10 compute-0 sudo[153476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:10 compute-0 python3.9[153478]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:10 compute-0 systemd[1]: Reloading.
Dec 02 23:38:11 compute-0 systemd-rc-local-generator[153502]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:11 compute-0 systemd-sysv-generator[153506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:11 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 02 23:38:11 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 02 23:38:11 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 02 23:38:11 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 02 23:38:11 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 02 23:38:11 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 02 23:38:11 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 02 23:38:11 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 02 23:38:11 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 02 23:38:11 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 02 23:38:11 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 02 23:38:11 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 02 23:38:11 compute-0 sudo[153476]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:11 compute-0 sudo[153691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aovuyqujlbxirdgwiuvwciwrinuppmgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718691.4942048-2102-52120597225050/AnsiballZ_systemd.py'
Dec 02 23:38:11 compute-0 sudo[153691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:12 compute-0 python3.9[153693]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:38:12 compute-0 systemd[1]: Reloading.
Dec 02 23:38:12 compute-0 systemd-rc-local-generator[153718]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:12 compute-0 systemd-sysv-generator[153723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:12 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 02 23:38:12 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 02 23:38:12 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 02 23:38:12 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 02 23:38:12 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 02 23:38:12 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 02 23:38:12 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 02 23:38:12 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 02 23:38:12 compute-0 sudo[153691]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:14 compute-0 sudo[153903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dufrdnbmglhblkhtstvkeflayivqkqrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718693.9589634-2176-274654471706502/AnsiballZ_file.py'
Dec 02 23:38:14 compute-0 sudo[153903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:14 compute-0 python3.9[153905]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:14 compute-0 sudo[153903]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:15 compute-0 sudo[154055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzqhdwssogppttyzaghpjcxgbxnpxjaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718694.8622396-2192-259409859855840/AnsiballZ_find.py'
Dec 02 23:38:15 compute-0 sudo[154055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:15 compute-0 python3.9[154057]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:38:15 compute-0 sudo[154055]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:16 compute-0 sudo[154207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etzgxtdkqaddwlfiwcfpbtqthttixotl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718695.9753072-2220-95286894836642/AnsiballZ_stat.py'
Dec 02 23:38:16 compute-0 sudo[154207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:16 compute-0 python3.9[154209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:16 compute-0 sudo[154207]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:16 compute-0 sudo[154330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mluhhgdbllqthsaflhiyqxguerwvkmpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718695.9753072-2220-95286894836642/AnsiballZ_copy.py'
Dec 02 23:38:16 compute-0 sudo[154330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:17 compute-0 python3.9[154332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718695.9753072-2220-95286894836642/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:17 compute-0 sudo[154330]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:17 compute-0 sudo[154482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qffabcrayaqjchkhxhjqphcqbtzrdycb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718697.591281-2252-83898446912605/AnsiballZ_file.py'
Dec 02 23:38:17 compute-0 sudo[154482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:18 compute-0 python3.9[154484]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:18 compute-0 sudo[154482]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:18 compute-0 sudo[154634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftcktobmhkcorbryrsdoamtxfgsjffgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718698.4904947-2268-222355295480709/AnsiballZ_stat.py'
Dec 02 23:38:18 compute-0 sudo[154634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:19 compute-0 python3.9[154636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:19 compute-0 sudo[154634]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:19 compute-0 sudo[154712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yljbmevdxxsdncipmaumqialbteyprej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718698.4904947-2268-222355295480709/AnsiballZ_file.py'
Dec 02 23:38:19 compute-0 sudo[154712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:19 compute-0 python3.9[154714]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:19 compute-0 sudo[154712]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:20 compute-0 sudo[154864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtechsmehpyjpneoycrrnwefjffkknzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718699.9682658-2292-157055163136686/AnsiballZ_stat.py'
Dec 02 23:38:20 compute-0 sudo[154864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:20 compute-0 python3.9[154866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:20 compute-0 sudo[154864]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:20 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 02 23:38:20 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 02 23:38:20 compute-0 sudo[154943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjegbzyiaiychnlrgpdggieerbmzngww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718699.9682658-2292-157055163136686/AnsiballZ_file.py'
Dec 02 23:38:20 compute-0 sudo[154943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:20 compute-0 python3.9[154945]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.4osmqphe recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:20 compute-0 sudo[154943]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:21 compute-0 sudo[155095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozyqswhoolhasvhtduofhgqdpfvusfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718701.3794777-2316-202426201671456/AnsiballZ_stat.py'
Dec 02 23:38:21 compute-0 sudo[155095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:21 compute-0 python3.9[155097]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:21 compute-0 sudo[155095]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:22 compute-0 sudo[155173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mihveaybsbxctfwdzmxrpeynxjpvbsdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718701.3794777-2316-202426201671456/AnsiballZ_file.py'
Dec 02 23:38:22 compute-0 sudo[155173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:22 compute-0 python3.9[155175]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:22 compute-0 sudo[155173]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:23 compute-0 sudo[155325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noskmzjxjvyeprtsgpelwdorglihhhdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718702.8609593-2342-180901089828022/AnsiballZ_command.py'
Dec 02 23:38:23 compute-0 sudo[155325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:23 compute-0 python3.9[155327]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:23 compute-0 sudo[155325]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:24 compute-0 sudo[155478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sigeybsbvyrtteinnbmlenxhcbutwkes ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718703.730305-2358-49571880667107/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:38:24 compute-0 sudo[155478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:24 compute-0 python3[155480]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:38:24 compute-0 sudo[155478]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:25 compute-0 sudo[155630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiorvmwqxrvfzipdauyvlhslitysxjwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718704.7043347-2374-230633642594983/AnsiballZ_stat.py'
Dec 02 23:38:25 compute-0 sudo[155630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:25 compute-0 python3.9[155632]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:25 compute-0 sudo[155630]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:25 compute-0 sudo[155708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcmhrtyffoyujulipiqbcsweonbseapy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718704.7043347-2374-230633642594983/AnsiballZ_file.py'
Dec 02 23:38:25 compute-0 sudo[155708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:25 compute-0 python3.9[155710]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:25 compute-0 sudo[155708]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:26 compute-0 sudo[155860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujepysulrnaxryasnvizuafdxjytazgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718706.1761043-2398-180878922502564/AnsiballZ_stat.py'
Dec 02 23:38:26 compute-0 sudo[155860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:26 compute-0 python3.9[155862]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:26 compute-0 sudo[155860]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:26 compute-0 sudo[155938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lblxgzahnrtahxycjkhglylbrnyipcgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718706.1761043-2398-180878922502564/AnsiballZ_file.py'
Dec 02 23:38:26 compute-0 sudo[155938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:27 compute-0 python3.9[155940]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:27 compute-0 sudo[155938]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:28 compute-0 sudo[156090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltuanymhefazbhgznrbzvtilqtcyctmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718707.7193606-2422-164105294356073/AnsiballZ_stat.py'
Dec 02 23:38:28 compute-0 sudo[156090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:28 compute-0 python3.9[156092]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:28 compute-0 sudo[156090]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:28 compute-0 sudo[156168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfcukvvypstqizmkdjmyzjcywbzqxkay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718707.7193606-2422-164105294356073/AnsiballZ_file.py'
Dec 02 23:38:28 compute-0 sudo[156168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:28 compute-0 python3.9[156170]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:28 compute-0 sudo[156168]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:29 compute-0 sudo[156320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrguzreufzodqzsyvdiahnkclrzipjmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718709.1668298-2446-95965463335139/AnsiballZ_stat.py'
Dec 02 23:38:29 compute-0 sudo[156320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:29 compute-0 python3.9[156322]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:29 compute-0 sudo[156320]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:29 compute-0 sudo[156398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzousvhmyldkpvbbpjkgwjyjiznezptq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718709.1668298-2446-95965463335139/AnsiballZ_file.py'
Dec 02 23:38:29 compute-0 sudo[156398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:30 compute-0 python3.9[156400]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:30 compute-0 sudo[156398]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:31 compute-0 sudo[156550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjajihvtcfdulqoxblpvlnvpqolqlvpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718710.6254556-2470-152094805713788/AnsiballZ_stat.py'
Dec 02 23:38:31 compute-0 sudo[156550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:31 compute-0 python3.9[156552]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:31 compute-0 sudo[156550]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:31 compute-0 sudo[156686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxdbceaypocgeyrmofujwueazlpqjsvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718710.6254556-2470-152094805713788/AnsiballZ_copy.py'
Dec 02 23:38:31 compute-0 sudo[156686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:31 compute-0 podman[156649]: 2025-12-02 23:38:31.976458135 +0000 UTC m=+0.098873315 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:38:32 compute-0 python3.9[156692]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764718710.6254556-2470-152094805713788/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:32 compute-0 sudo[156686]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:32 compute-0 sudo[156853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojqpezeyayehxvugbzgqebykyhzzmfdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718712.3121796-2500-52387549373801/AnsiballZ_file.py'
Dec 02 23:38:32 compute-0 sudo[156853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:32 compute-0 python3.9[156855]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:32 compute-0 sudo[156853]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:33 compute-0 sudo[157020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjtzsafimpyspqgzsytpnwfpozetpmhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718713.2774749-2516-272738805796310/AnsiballZ_command.py'
Dec 02 23:38:33 compute-0 sudo[157020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:33 compute-0 podman[156979]: 2025-12-02 23:38:33.610276371 +0000 UTC m=+0.083323608 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:38:33 compute-0 python3.9[157024]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:33 compute-0 sudo[157020]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:34 compute-0 sudo[157177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytwhdbuuoumeuwjyvupujgfvlyydftzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718713.9985557-2532-148468007426284/AnsiballZ_blockinfile.py'
Dec 02 23:38:34 compute-0 sudo[157177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:34 compute-0 python3.9[157179]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:34 compute-0 sudo[157177]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:35 compute-0 sudo[157329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdqsfjruwmfmxngucnchlhkxtjldwlae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718714.9401917-2550-84768673579345/AnsiballZ_command.py'
Dec 02 23:38:35 compute-0 sudo[157329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:35 compute-0 python3.9[157331]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:35 compute-0 sudo[157329]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:36 compute-0 sudo[157482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvgvxvtehzuuijhvrgldjlwxmbsnyjrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718715.723298-2566-171606689507457/AnsiballZ_stat.py'
Dec 02 23:38:36 compute-0 sudo[157482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:36 compute-0 python3.9[157484]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:38:36 compute-0 sudo[157482]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:36 compute-0 sudo[157636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bysulrjyrfboeibkzvtuzfjjtxslatmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718716.4268-2582-110502107211074/AnsiballZ_command.py'
Dec 02 23:38:36 compute-0 sudo[157636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:36 compute-0 python3.9[157638]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:38:36 compute-0 sudo[157636]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:37 compute-0 sudo[157791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnfugnjgdcgzthsccjajgxgwbbdinxwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718717.345856-2598-75032380615374/AnsiballZ_file.py'
Dec 02 23:38:37 compute-0 sudo[157791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:37 compute-0 python3.9[157793]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:37 compute-0 sudo[157791]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:38 compute-0 sudo[157943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obheysedlhjmgtqsmdmsrbavlbdiilla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718718.124642-2614-120198236076409/AnsiballZ_stat.py'
Dec 02 23:38:38 compute-0 sudo[157943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:38 compute-0 python3.9[157945]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:38 compute-0 sudo[157943]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:39 compute-0 sudo[158066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pybftrnnymzsegbkajmxhgstrjdpgbms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718718.124642-2614-120198236076409/AnsiballZ_copy.py'
Dec 02 23:38:39 compute-0 sudo[158066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:39 compute-0 python3.9[158068]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718718.124642-2614-120198236076409/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:39 compute-0 sudo[158066]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:40 compute-0 sudo[158218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftuzlzzosopesfnmexuwepdhwgcilkcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718719.6106014-2644-77291860260312/AnsiballZ_stat.py'
Dec 02 23:38:40 compute-0 sudo[158218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:40 compute-0 python3.9[158220]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:40 compute-0 sudo[158218]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:40 compute-0 sudo[158341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xukxbofdolrwsynexinmbrydzvjqijik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718719.6106014-2644-77291860260312/AnsiballZ_copy.py'
Dec 02 23:38:40 compute-0 sudo[158341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:40 compute-0 python3.9[158343]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718719.6106014-2644-77291860260312/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:40 compute-0 sudo[158341]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:41 compute-0 sudo[158493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scatmzzbshnhjmpaiyssxrpjprmjxvdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718721.259187-2674-220855515059924/AnsiballZ_stat.py'
Dec 02 23:38:41 compute-0 sudo[158493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:41 compute-0 python3.9[158495]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:38:41 compute-0 sudo[158493]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:42 compute-0 sudo[158616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itobrymvbvvpdsifsnxsnjbepmwqjipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718721.259187-2674-220855515059924/AnsiballZ_copy.py'
Dec 02 23:38:42 compute-0 sudo[158616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:42 compute-0 python3.9[158618]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718721.259187-2674-220855515059924/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:38:42 compute-0 sudo[158616]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:43 compute-0 sudo[158768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcmigwlydxtjibtrkaeqssbuuuesggje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718722.7872176-2704-113951835989704/AnsiballZ_systemd.py'
Dec 02 23:38:43 compute-0 sudo[158768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:43 compute-0 python3.9[158770]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:38:43 compute-0 systemd[1]: Reloading.
Dec 02 23:38:43 compute-0 systemd-rc-local-generator[158798]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:43 compute-0 systemd-sysv-generator[158801]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:43 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 02 23:38:43 compute-0 sudo[158768]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:44 compute-0 sudo[158959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpkhpdnelvxqeoxgoexrsaxwegkzkfsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718724.1903129-2720-113961528128892/AnsiballZ_systemd.py'
Dec 02 23:38:44 compute-0 sudo[158959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:38:44 compute-0 python3.9[158961]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 02 23:38:44 compute-0 systemd[1]: Reloading.
Dec 02 23:38:44 compute-0 systemd-sysv-generator[158994]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:44 compute-0 systemd-rc-local-generator[158990]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:45 compute-0 systemd[1]: Reloading.
Dec 02 23:38:45 compute-0 systemd-rc-local-generator[159027]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:38:45 compute-0 systemd-sysv-generator[159031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:38:45 compute-0 sudo[158959]: pam_unix(sudo:session): session closed for user root
Dec 02 23:38:46 compute-0 sshd-session[104507]: Connection closed by 192.168.122.30 port 60008
Dec 02 23:38:46 compute-0 sshd-session[104504]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:38:46 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Dec 02 23:38:46 compute-0 systemd[1]: session-23.scope: Consumed 3min 34.627s CPU time.
Dec 02 23:38:46 compute-0 systemd-logind[795]: Session 23 logged out. Waiting for processes to exit.
Dec 02 23:38:46 compute-0 systemd-logind[795]: Removed session 23.
Dec 02 23:38:51 compute-0 sshd-session[159057]: Accepted publickey for zuul from 192.168.122.30 port 41530 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:38:51 compute-0 systemd-logind[795]: New session 24 of user zuul.
Dec 02 23:38:51 compute-0 systemd[1]: Started Session 24 of User zuul.
Dec 02 23:38:51 compute-0 sshd-session[159057]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:38:52 compute-0 python3.9[159212]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:38:53 compute-0 sshd-session[159113]: Invalid user temp from 49.247.36.49 port 42208
Dec 02 23:38:53 compute-0 sshd-session[159113]: Received disconnect from 49.247.36.49 port 42208:11: Bye Bye [preauth]
Dec 02 23:38:53 compute-0 sshd-session[159113]: Disconnected from invalid user temp 49.247.36.49 port 42208 [preauth]
Dec 02 23:38:54 compute-0 python3.9[159366]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:38:54 compute-0 network[159383]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:38:54 compute-0 network[159384]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:38:54 compute-0 network[159385]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:39:00 compute-0 sudo[159655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uupgfxgqjcheyycorydmujgatepevkmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718740.0908892-74-105588651822363/AnsiballZ_setup.py'
Dec 02 23:39:00 compute-0 sudo[159655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:39:00.650 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:39:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:39:00.651 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:39:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:39:00.651 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:39:00 compute-0 python3.9[159657]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 02 23:39:01 compute-0 sudo[159655]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:01 compute-0 sudo[159740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeexitlyuxykgdxkbbikrhpmyrcrtwah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718740.0908892-74-105588651822363/AnsiballZ_dnf.py'
Dec 02 23:39:01 compute-0 sudo[159740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:01 compute-0 python3.9[159742]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:39:02 compute-0 podman[159744]: 2025-12-02 23:39:02.186635734 +0000 UTC m=+0.131205986 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 02 23:39:04 compute-0 podman[159770]: 2025-12-02 23:39:04.112430598 +0000 UTC m=+0.065798137 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 02 23:39:06 compute-0 sudo[159740]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:07 compute-0 sudo[159939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpiysszqdlzualipxkwylvmcnjtihoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718747.327141-98-82693258734537/AnsiballZ_stat.py'
Dec 02 23:39:07 compute-0 sudo[159939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:08 compute-0 python3.9[159941]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:08 compute-0 sudo[159939]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:08 compute-0 sudo[160091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-benuzivnbtdgzamahubyjiuejsdpewkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718748.337929-118-177455787950581/AnsiballZ_command.py'
Dec 02 23:39:08 compute-0 sudo[160091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:08 compute-0 python3.9[160093]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:39:08 compute-0 sudo[160091]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:09 compute-0 sudo[160244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-impzsvzviuzvyqnmqgedbtrnvaickgom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718749.4032717-138-179310733661488/AnsiballZ_stat.py'
Dec 02 23:39:09 compute-0 sudo[160244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:09 compute-0 python3.9[160246]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:09 compute-0 sudo[160244]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:10 compute-0 sudo[160396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjlzbtotxpelceweguawfjofwobdhcuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718750.2958946-154-231104993874214/AnsiballZ_command.py'
Dec 02 23:39:10 compute-0 sudo[160396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:10 compute-0 python3.9[160398]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:39:10 compute-0 sudo[160396]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:11 compute-0 sudo[160549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dujdjsdzmuueugtczzzkgpywfmrgsgbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718751.0727782-170-260270080817553/AnsiballZ_stat.py'
Dec 02 23:39:11 compute-0 sudo[160549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:11 compute-0 python3.9[160551]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:11 compute-0 sudo[160549]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:12 compute-0 sudo[160672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzuwdzydtovcrorpxoltyviiehmdugwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718751.0727782-170-260270080817553/AnsiballZ_copy.py'
Dec 02 23:39:12 compute-0 sudo[160672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:12 compute-0 python3.9[160674]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718751.0727782-170-260270080817553/.source.iscsi _original_basename=.y7ze5a5_ follow=False checksum=dfd6d9ae9d8c54b448a9ec30e9c1d654031baa70 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:12 compute-0 sudo[160672]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:13 compute-0 sudo[160824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdfceiyoxfsaddrymkanaqixalztcjlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718752.5576-200-215050357845801/AnsiballZ_file.py'
Dec 02 23:39:13 compute-0 sudo[160824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:13 compute-0 python3.9[160826]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:13 compute-0 sudo[160824]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:14 compute-0 sudo[160976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phtwlgpnioyrzzqnbzlpivwdphyjbidw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718753.551155-216-33737835612886/AnsiballZ_lineinfile.py'
Dec 02 23:39:14 compute-0 sudo[160976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:14 compute-0 python3.9[160978]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:14 compute-0 sudo[160976]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:14 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:39:15 compute-0 sudo[161129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djiasnywtgulizxuudogixgmjsholxub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718754.676921-234-44236145764601/AnsiballZ_systemd_service.py'
Dec 02 23:39:15 compute-0 sudo[161129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:15 compute-0 python3.9[161131]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:15 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 02 23:39:15 compute-0 sudo[161129]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:17 compute-0 sudo[161285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxfoeztpreiquaakhfdexpcnxaqhkxvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718756.932298-250-111305832596367/AnsiballZ_systemd_service.py'
Dec 02 23:39:17 compute-0 sudo[161285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:17 compute-0 python3.9[161287]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:17 compute-0 systemd[1]: Reloading.
Dec 02 23:39:17 compute-0 systemd-rc-local-generator[161307]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:39:17 compute-0 systemd-sysv-generator[161316]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:39:17 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 02 23:39:17 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 02 23:39:17 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 02 23:39:17 compute-0 systemd[1]: Started Open-iSCSI.
Dec 02 23:39:17 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 02 23:39:17 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 02 23:39:17 compute-0 sudo[161285]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:18 compute-0 sudo[161485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaqoltlddvboxwggpexhkoyfjazcragp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718758.5384762-272-50268289241547/AnsiballZ_service_facts.py'
Dec 02 23:39:18 compute-0 sudo[161485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:19 compute-0 python3.9[161487]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:39:19 compute-0 network[161504]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:39:19 compute-0 network[161505]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:39:19 compute-0 network[161506]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:39:23 compute-0 sudo[161485]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:24 compute-0 sudo[161775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttgqmzfwosemsezwnktslmqvxleryywz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718763.9080088-292-111871593727213/AnsiballZ_file.py'
Dec 02 23:39:24 compute-0 sudo[161775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:24 compute-0 python3.9[161777]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 23:39:24 compute-0 sudo[161775]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:25 compute-0 sudo[161927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsrqfehqklamjzujfmccfyxfkgmmzoup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718764.8359299-308-260891551449508/AnsiballZ_modprobe.py'
Dec 02 23:39:25 compute-0 sudo[161927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:25 compute-0 python3.9[161929]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 02 23:39:25 compute-0 sudo[161927]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:26 compute-0 sudo[162083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frgnprfwgtoqiuqjnjvpumbldnwhaodi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718765.781254-324-161229209940869/AnsiballZ_stat.py'
Dec 02 23:39:26 compute-0 sudo[162083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:26 compute-0 python3.9[162085]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:26 compute-0 sudo[162083]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:26 compute-0 sudo[162206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giqryizdsbblecvczagzgcqshzvjxwng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718765.781254-324-161229209940869/AnsiballZ_copy.py'
Dec 02 23:39:26 compute-0 sudo[162206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:26 compute-0 python3.9[162208]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718765.781254-324-161229209940869/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:26 compute-0 sudo[162206]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:27 compute-0 sudo[162358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozipkrjbpxslvdaujitjzwllywgnnvvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718767.2330997-356-264302414598185/AnsiballZ_lineinfile.py'
Dec 02 23:39:27 compute-0 sudo[162358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:27 compute-0 python3.9[162360]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:27 compute-0 sudo[162358]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:28 compute-0 sudo[162510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mknkemkwnesvbcborszhbvsbysknvrkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718768.1583788-372-104999228312269/AnsiballZ_systemd.py'
Dec 02 23:39:28 compute-0 sudo[162510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:29 compute-0 python3.9[162512]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:39:29 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 23:39:29 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 02 23:39:29 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 02 23:39:29 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 02 23:39:29 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 02 23:39:29 compute-0 sudo[162510]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:29 compute-0 sudo[162666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxlracpsuqeskzymzkoospnftfnuetbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718769.4810894-388-193154296546584/AnsiballZ_file.py'
Dec 02 23:39:29 compute-0 sudo[162666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:30 compute-0 python3.9[162668]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:30 compute-0 sudo[162666]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:30 compute-0 sudo[162818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xucirmzaqlhqzmxnvqcdhgsrxiffiyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718770.5367613-406-267196133328366/AnsiballZ_stat.py'
Dec 02 23:39:30 compute-0 sudo[162818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:31 compute-0 python3.9[162820]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:31 compute-0 sudo[162818]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:31 compute-0 sudo[162970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njeioxonxhivzpiytocnxirgszchiper ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718771.3809342-424-148348648373227/AnsiballZ_stat.py'
Dec 02 23:39:31 compute-0 sudo[162970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:31 compute-0 python3.9[162972]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:31 compute-0 sudo[162970]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:32 compute-0 sudo[163133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-borhgraotrdpyqwfymhuvkadlrecrwek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718772.464538-440-182761769663690/AnsiballZ_stat.py'
Dec 02 23:39:32 compute-0 sudo[163133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:32 compute-0 podman[163096]: 2025-12-02 23:39:32.974505481 +0000 UTC m=+0.193329157 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:39:33 compute-0 python3.9[163140]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:33 compute-0 sudo[163133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:33 compute-0 sudo[163271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbtfadsjalrevomigfjhkrwgfxmnyqfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718772.464538-440-182761769663690/AnsiballZ_copy.py'
Dec 02 23:39:33 compute-0 sudo[163271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:33 compute-0 python3.9[163273]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718772.464538-440-182761769663690/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:33 compute-0 sudo[163271]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:34 compute-0 podman[163397]: 2025-12-02 23:39:34.311952364 +0000 UTC m=+0.061012886 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 23:39:34 compute-0 sudo[163440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icdojpzrgerigszudvaubinrlkwakrfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718773.9530492-470-66400070486797/AnsiballZ_command.py'
Dec 02 23:39:34 compute-0 sudo[163440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:34 compute-0 python3.9[163444]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:39:34 compute-0 sudo[163440]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:35 compute-0 sudo[163595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlnpdjiuanhrvvcbnhtwcxylkdaabyxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718775.4817257-486-267796509513828/AnsiballZ_lineinfile.py'
Dec 02 23:39:35 compute-0 sudo[163595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:36 compute-0 python3.9[163597]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:36 compute-0 sudo[163595]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:36 compute-0 sudo[163747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytcfhpezjsuoettypbrvffqpevobazfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718776.2742364-502-1399655150832/AnsiballZ_replace.py'
Dec 02 23:39:36 compute-0 sudo[163747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:37 compute-0 python3.9[163749]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:37 compute-0 sudo[163747]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:37 compute-0 sudo[163899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzyfhcrhkvzvszbuihftsubawjgdzmff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718777.2968714-518-6998236793908/AnsiballZ_replace.py'
Dec 02 23:39:37 compute-0 sudo[163899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:37 compute-0 python3.9[163901]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:37 compute-0 sudo[163899]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:38 compute-0 sudo[164051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qofyeqzfadkhbdvznywufjkynrixofmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718778.286206-536-219886802892564/AnsiballZ_lineinfile.py'
Dec 02 23:39:38 compute-0 sudo[164051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:38 compute-0 python3.9[164053]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:38 compute-0 sudo[164051]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:39 compute-0 sudo[164203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejxnmmofcmbahytlgeuvzxvkhpkkjgve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718779.1291382-536-7402051788352/AnsiballZ_lineinfile.py'
Dec 02 23:39:39 compute-0 sudo[164203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:39 compute-0 python3.9[164205]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:39 compute-0 sudo[164203]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:40 compute-0 sudo[164355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqlwwziqanbahuukxnemqirghuhuggpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718779.8923235-536-39145436322379/AnsiballZ_lineinfile.py'
Dec 02 23:39:40 compute-0 sudo[164355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:40 compute-0 python3.9[164357]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:40 compute-0 sudo[164355]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:41 compute-0 sudo[164507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhnpfrakpnadfsxasimmwtmcrnbvkved ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718780.742989-536-81319847048510/AnsiballZ_lineinfile.py'
Dec 02 23:39:41 compute-0 sudo[164507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:41 compute-0 python3.9[164509]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:41 compute-0 sudo[164507]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:41 compute-0 sudo[164659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkbheizdygihjuphfqrsmhypyltpcmpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718781.569423-594-137774387977619/AnsiballZ_stat.py'
Dec 02 23:39:41 compute-0 sudo[164659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:42 compute-0 python3.9[164661]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:39:42 compute-0 sudo[164659]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:42 compute-0 sudo[164813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozofrdwlmhewltpkphbrqtxnxwbmumsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718782.4361756-610-44538290229651/AnsiballZ_file.py'
Dec 02 23:39:42 compute-0 sudo[164813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:42 compute-0 python3.9[164815]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:42 compute-0 sudo[164813]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:43 compute-0 sudo[164965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkclhebgowtfmlitozxdvicyjkjxntin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718783.3600173-628-273334951357043/AnsiballZ_file.py'
Dec 02 23:39:43 compute-0 sudo[164965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:43 compute-0 python3.9[164967]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:43 compute-0 sudo[164965]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:44 compute-0 sudo[165117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pndqgujhxeeaqowqzqykxoakyapwzuwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718784.1838322-644-110900465339800/AnsiballZ_stat.py'
Dec 02 23:39:44 compute-0 sudo[165117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:44 compute-0 python3.9[165119]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:44 compute-0 sudo[165117]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:44 compute-0 sudo[165195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pasrripaltdqcinpysdhxkagtnvuyshp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718784.1838322-644-110900465339800/AnsiballZ_file.py'
Dec 02 23:39:44 compute-0 sudo[165195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:45 compute-0 python3.9[165197]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:45 compute-0 sudo[165195]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:45 compute-0 sudo[165347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzxeakunhajxcthzboxjintypmwgbkre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718785.3151872-644-79577523637884/AnsiballZ_stat.py'
Dec 02 23:39:45 compute-0 sudo[165347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:45 compute-0 python3.9[165349]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:45 compute-0 sudo[165347]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:46 compute-0 sudo[165425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxxkjnzhkasxifvrbiuignvkbkxmxcww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718785.3151872-644-79577523637884/AnsiballZ_file.py'
Dec 02 23:39:46 compute-0 sudo[165425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:46 compute-0 python3.9[165427]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:46 compute-0 sudo[165425]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:47 compute-0 sudo[165577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lidqokwkdjnansebjlqusrmkanelqzuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718786.9629524-690-8834335265959/AnsiballZ_file.py'
Dec 02 23:39:47 compute-0 sudo[165577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:47 compute-0 python3.9[165579]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:47 compute-0 sudo[165577]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:48 compute-0 sudo[165729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrhmhoggezqrcbkfrdomxwaeblfexkal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718787.8360517-706-187345413708499/AnsiballZ_stat.py'
Dec 02 23:39:48 compute-0 sudo[165729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:48 compute-0 python3.9[165731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:48 compute-0 sudo[165729]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:48 compute-0 sudo[165807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndtuazbhhyhmovecvzzarsbzcqahfqdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718787.8360517-706-187345413708499/AnsiballZ_file.py'
Dec 02 23:39:48 compute-0 sudo[165807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:48 compute-0 python3.9[165809]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:49 compute-0 sudo[165807]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:49 compute-0 sudo[165959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igbkedmdrxgskzjnzngbjxzhvzvjmwpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718789.5218682-730-76100154842866/AnsiballZ_stat.py'
Dec 02 23:39:49 compute-0 sudo[165959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:50 compute-0 python3.9[165961]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:50 compute-0 sudo[165959]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:50 compute-0 sudo[166037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaoigpstaypagwjijnmagagfragfbkqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718789.5218682-730-76100154842866/AnsiballZ_file.py'
Dec 02 23:39:50 compute-0 sudo[166037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:50 compute-0 python3.9[166039]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:50 compute-0 sudo[166037]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:51 compute-0 sudo[166189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ismkkkngfobyhbosrfyafbzrsfitgibo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718790.9988217-754-153770702151346/AnsiballZ_systemd.py'
Dec 02 23:39:51 compute-0 sudo[166189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:51 compute-0 python3.9[166191]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:51 compute-0 systemd[1]: Reloading.
Dec 02 23:39:51 compute-0 systemd-rc-local-generator[166213]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:39:51 compute-0 systemd-sysv-generator[166220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:39:52 compute-0 sudo[166189]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:52 compute-0 sudo[166378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhgxyqhukdvcimbyyypqkywfistekev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718792.4464958-770-192598665095743/AnsiballZ_stat.py'
Dec 02 23:39:52 compute-0 sudo[166378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:52 compute-0 python3.9[166380]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:53 compute-0 sudo[166378]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:53 compute-0 sudo[166456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcacyvdxfvoatbikjtgdwkfxuzrsajtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718792.4464958-770-192598665095743/AnsiballZ_file.py'
Dec 02 23:39:53 compute-0 sudo[166456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:53 compute-0 python3.9[166458]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:53 compute-0 sudo[166456]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:54 compute-0 sudo[166608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcfipmutjfdekmbtelgnfuizlzshxksr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718793.7530138-794-96787950669333/AnsiballZ_stat.py'
Dec 02 23:39:54 compute-0 sudo[166608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:54 compute-0 python3.9[166610]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:54 compute-0 sudo[166608]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:54 compute-0 sudo[166686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpsjldlsabnsvvgjlcsvydmvasoxqdbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718793.7530138-794-96787950669333/AnsiballZ_file.py'
Dec 02 23:39:54 compute-0 sudo[166686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:54 compute-0 python3.9[166688]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:39:54 compute-0 sudo[166686]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:55 compute-0 sudo[166838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewsljjlrvlpqnozcgrnpeoondumckeiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718795.3573382-818-214573800592910/AnsiballZ_systemd.py'
Dec 02 23:39:55 compute-0 sudo[166838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:56 compute-0 python3.9[166840]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:39:56 compute-0 systemd[1]: Reloading.
Dec 02 23:39:56 compute-0 systemd-sysv-generator[166870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:39:56 compute-0 systemd-rc-local-generator[166865]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:39:56 compute-0 systemd[1]: Starting Create netns directory...
Dec 02 23:39:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 02 23:39:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 02 23:39:56 compute-0 systemd[1]: Finished Create netns directory.
Dec 02 23:39:56 compute-0 sudo[166838]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:57 compute-0 sudo[167030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfepaaodxiutypdllyjgxijauuvrvufy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718796.941207-838-76582128696685/AnsiballZ_file.py'
Dec 02 23:39:57 compute-0 sudo[167030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:57 compute-0 python3.9[167032]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:57 compute-0 sudo[167030]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:58 compute-0 sudo[167182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfkooiwjnocxvwckmpgmisqwtppvmddb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718797.8438125-854-32767046268617/AnsiballZ_stat.py'
Dec 02 23:39:58 compute-0 sudo[167182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:58 compute-0 python3.9[167184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:39:58 compute-0 sudo[167182]: pam_unix(sudo:session): session closed for user root
Dec 02 23:39:58 compute-0 sudo[167305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcuzzjfashjdiuejymqbbiywgwxjodrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718797.8438125-854-32767046268617/AnsiballZ_copy.py'
Dec 02 23:39:58 compute-0 sudo[167305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:39:59 compute-0 python3.9[167307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718797.8438125-854-32767046268617/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:39:59 compute-0 sudo[167305]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:00 compute-0 sudo[167457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofvmvclocwddthndkcemzemkcrtrgnci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718799.890355-888-245553422310795/AnsiballZ_file.py'
Dec 02 23:40:00 compute-0 sudo[167457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:00 compute-0 python3.9[167459]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:40:00 compute-0 sudo[167457]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:40:00.652 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:40:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:40:00.653 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:40:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:40:00.653 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:40:01 compute-0 sudo[167610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxejtggqeoebwblcqcdgiyaqmxrmmuhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718800.7866435-904-197495776140252/AnsiballZ_stat.py'
Dec 02 23:40:01 compute-0 sudo[167610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:01 compute-0 python3.9[167612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:40:01 compute-0 sudo[167610]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:01 compute-0 sudo[167733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpjtljmbnquxbqpqlezedvyixazroqcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718800.7866435-904-197495776140252/AnsiballZ_copy.py'
Dec 02 23:40:01 compute-0 sudo[167733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:01 compute-0 python3.9[167735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718800.7866435-904-197495776140252/.source.json _original_basename=.gknfodkz follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:02 compute-0 sudo[167733]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:02 compute-0 sudo[167885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svdopjpmlreltpcsimhtgidsyznkdrfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718802.464797-934-63561857579523/AnsiballZ_file.py'
Dec 02 23:40:02 compute-0 sudo[167885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:02 compute-0 python3.9[167887]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:02 compute-0 sudo[167885]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:03 compute-0 podman[167912]: 2025-12-02 23:40:03.179748753 +0000 UTC m=+0.130692565 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 02 23:40:03 compute-0 sudo[168064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rixvecwctsbknyoxzxcbidqhmfppcmvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718803.339182-950-226376990511540/AnsiballZ_stat.py'
Dec 02 23:40:03 compute-0 sudo[168064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:03 compute-0 sudo[168064]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:04 compute-0 sudo[168187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dftpsnhedgormifrhbgmswxowfslcasn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718803.339182-950-226376990511540/AnsiballZ_copy.py'
Dec 02 23:40:04 compute-0 sudo[168187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:04 compute-0 podman[168189]: 2025-12-02 23:40:04.417479077 +0000 UTC m=+0.058961421 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:40:04 compute-0 sudo[168187]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:05 compute-0 sudo[168359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpdjqfrgyjiopknlegqnshxvzmhlqrnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718805.2624667-984-145757395890565/AnsiballZ_container_config_data.py'
Dec 02 23:40:05 compute-0 sudo[168359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:06 compute-0 python3.9[168361]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 02 23:40:06 compute-0 sudo[168359]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:06 compute-0 sudo[168511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcjzqtnswjwtttypicfokqwmwxdsevgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718806.4418592-1002-170909586679396/AnsiballZ_container_config_hash.py'
Dec 02 23:40:06 compute-0 sudo[168511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:07 compute-0 python3.9[168513]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:40:07 compute-0 sudo[168511]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:08 compute-0 sudo[168665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrriuhjuotztvleuqoevktmxryzruxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718807.5737612-1020-35773696145619/AnsiballZ_podman_container_info.py'
Dec 02 23:40:08 compute-0 sudo[168665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:08 compute-0 python3.9[168667]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 02 23:40:08 compute-0 sudo[168665]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:08 compute-0 sshd-session[168538]: Invalid user bitnami from 45.78.218.154 port 33282
Dec 02 23:40:09 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 02 23:40:10 compute-0 sudo[168845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjolzxfommhkotsnwzecbguvgkownxcf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718809.4275923-1046-132341383508638/AnsiballZ_edpm_container_manage.py'
Dec 02 23:40:10 compute-0 sudo[168845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:10 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 02 23:40:10 compute-0 python3[168847]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:40:10 compute-0 podman[168885]: 2025-12-02 23:40:10.556605589 +0000 UTC m=+0.052259436 container create a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:40:10 compute-0 podman[168885]: 2025-12-02 23:40:10.529633336 +0000 UTC m=+0.025287193 image pull 13a8acc03c3934b75192e1b3a8c127f56bf115253a854621e8e0e8b6330d5e9b 38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Dec 02 23:40:10 compute-0 python3[168847]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Dec 02 23:40:10 compute-0 sudo[168845]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:11 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 02 23:40:11 compute-0 sudo[169074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-resxgyhzimfhajnxdiievfyjrzarewwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718811.1168044-1062-176307438157890/AnsiballZ_stat.py'
Dec 02 23:40:11 compute-0 sudo[169074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:11 compute-0 python3.9[169076]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:40:11 compute-0 sudo[169074]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:12 compute-0 sudo[169228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqmnfjvcakbmxffiwyglljhydagraojr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718812.1718493-1080-126446684320658/AnsiballZ_file.py'
Dec 02 23:40:12 compute-0 sudo[169228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:12 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 02 23:40:12 compute-0 python3.9[169230]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:12 compute-0 sudo[169228]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:13 compute-0 sudo[169305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwoygtxnsnxjssfqnivhrogqppnumtlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718812.1718493-1080-126446684320658/AnsiballZ_stat.py'
Dec 02 23:40:13 compute-0 sudo[169305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:13 compute-0 python3.9[169307]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:40:13 compute-0 sudo[169305]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:13 compute-0 sudo[169456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxestsmetbgltablxnabiflauzerqkiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718813.4814305-1080-84617899705900/AnsiballZ_copy.py'
Dec 02 23:40:13 compute-0 sudo[169456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:14 compute-0 python3.9[169458]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718813.4814305-1080-84617899705900/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:14 compute-0 sudo[169456]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:14 compute-0 sudo[169534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oskzxpjsimgrsukmsjixlgvzznqneeex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718813.4814305-1080-84617899705900/AnsiballZ_systemd.py'
Dec 02 23:40:14 compute-0 sudo[169534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:14 compute-0 python3.9[169536]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:40:14 compute-0 systemd[1]: Reloading.
Dec 02 23:40:14 compute-0 systemd-sysv-generator[169566]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:14 compute-0 systemd-rc-local-generator[169563]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:15 compute-0 sudo[169534]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:15 compute-0 sudo[169644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qolqagxogfhtdursfcwjazsisbusctes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718813.4814305-1080-84617899705900/AnsiballZ_systemd.py'
Dec 02 23:40:15 compute-0 sudo[169644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:15 compute-0 python3.9[169646]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:15 compute-0 systemd[1]: Reloading.
Dec 02 23:40:16 compute-0 systemd-sysv-generator[169679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:16 compute-0 systemd-rc-local-generator[169676]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:16 compute-0 systemd[1]: Starting multipathd container...
Dec 02 23:40:16 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:40:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5194ee09d9b97aca0e4609763a0d7a9eabdd8a1068b6005b666638f2fc5ae894/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5194ee09d9b97aca0e4609763a0d7a9eabdd8a1068b6005b666638f2fc5ae894/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.
Dec 02 23:40:16 compute-0 podman[169686]: 2025-12-02 23:40:16.394522465 +0000 UTC m=+0.117018128 container init a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:40:16 compute-0 multipathd[169700]: + sudo -E kolla_set_configs
Dec 02 23:40:16 compute-0 podman[169686]: 2025-12-02 23:40:16.429201108 +0000 UTC m=+0.151696711 container start a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:40:16 compute-0 podman[169686]: multipathd
Dec 02 23:40:16 compute-0 sudo[169708]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 23:40:16 compute-0 systemd[1]: Started multipathd container.
Dec 02 23:40:16 compute-0 sudo[169708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:16 compute-0 sudo[169644]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:16 compute-0 multipathd[169700]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:40:16 compute-0 multipathd[169700]: INFO:__main__:Validating config file
Dec 02 23:40:16 compute-0 multipathd[169700]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:40:16 compute-0 multipathd[169700]: INFO:__main__:Writing out command to execute
Dec 02 23:40:16 compute-0 sudo[169708]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:16 compute-0 multipathd[169700]: ++ cat /run_command
Dec 02 23:40:16 compute-0 multipathd[169700]: + CMD='/usr/sbin/multipathd -d'
Dec 02 23:40:16 compute-0 multipathd[169700]: + ARGS=
Dec 02 23:40:16 compute-0 multipathd[169700]: + sudo kolla_copy_cacerts
Dec 02 23:40:16 compute-0 sudo[169732]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 23:40:16 compute-0 sudo[169732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:16 compute-0 sudo[169732]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:16 compute-0 multipathd[169700]: + [[ ! -n '' ]]
Dec 02 23:40:16 compute-0 multipathd[169700]: + . kolla_extend_start
Dec 02 23:40:16 compute-0 multipathd[169700]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 02 23:40:16 compute-0 multipathd[169700]: Running command: '/usr/sbin/multipathd -d'
Dec 02 23:40:16 compute-0 multipathd[169700]: + umask 0022
Dec 02 23:40:16 compute-0 multipathd[169700]: + exec /usr/sbin/multipathd -d
Dec 02 23:40:16 compute-0 podman[169709]: 2025-12-02 23:40:16.539709195 +0000 UTC m=+0.097304763 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:40:16 compute-0 systemd[1]: a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a-41900decfae35148.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:40:16 compute-0 multipathd[169700]: 2907.254927 | multipathd v0.9.9: start up
Dec 02 23:40:16 compute-0 systemd[1]: a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a-41900decfae35148.service: Failed with result 'exit-code'.
Dec 02 23:40:16 compute-0 multipathd[169700]: 2907.261538 | reconfigure: setting up paths and maps
Dec 02 23:40:16 compute-0 multipathd[169700]: 2907.264489 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Dec 02 23:40:16 compute-0 multipathd[169700]: 2907.266214 | updated bindings file /etc/multipath/bindings
Dec 02 23:40:17 compute-0 sshd-session[169459]: Invalid user fiscal from 45.78.219.213 port 59052
Dec 02 23:40:17 compute-0 python3.9[169891]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:40:17 compute-0 sshd-session[169459]: Received disconnect from 45.78.219.213 port 59052:11: Bye Bye [preauth]
Dec 02 23:40:17 compute-0 sshd-session[169459]: Disconnected from invalid user fiscal 45.78.219.213 port 59052 [preauth]
Dec 02 23:40:18 compute-0 sudo[170043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioumpuvdumesdqmmdeeitxirlbkbshcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718817.8797214-1152-251784733570115/AnsiballZ_command.py'
Dec 02 23:40:18 compute-0 sudo[170043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:18 compute-0 python3.9[170045]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:40:18 compute-0 sudo[170043]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:19 compute-0 sudo[170210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsihkeytwrodbjeimjglmsnrsbrcokvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718818.8400307-1168-123869412699313/AnsiballZ_systemd.py'
Dec 02 23:40:19 compute-0 sudo[170210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:19 compute-0 python3.9[170212]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:40:19 compute-0 systemd[1]: Stopping multipathd container...
Dec 02 23:40:19 compute-0 sshd-session[170059]: Invalid user username from 49.247.36.49 port 51719
Dec 02 23:40:19 compute-0 multipathd[169700]: 2910.336794 | multipathd: shut down
Dec 02 23:40:19 compute-0 systemd[1]: libpod-a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.scope: Deactivated successfully.
Dec 02 23:40:19 compute-0 podman[170216]: 2025-12-02 23:40:19.659795473 +0000 UTC m=+0.076199574 container died a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:40:19 compute-0 systemd[1]: a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a-41900decfae35148.timer: Deactivated successfully.
Dec 02 23:40:19 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.
Dec 02 23:40:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a-userdata-shm.mount: Deactivated successfully.
Dec 02 23:40:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-5194ee09d9b97aca0e4609763a0d7a9eabdd8a1068b6005b666638f2fc5ae894-merged.mount: Deactivated successfully.
Dec 02 23:40:19 compute-0 podman[170216]: 2025-12-02 23:40:19.719670726 +0000 UTC m=+0.136074817 container cleanup a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 02 23:40:19 compute-0 podman[170216]: multipathd
Dec 02 23:40:19 compute-0 podman[170241]: multipathd
Dec 02 23:40:19 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 02 23:40:19 compute-0 systemd[1]: Stopped multipathd container.
Dec 02 23:40:19 compute-0 sshd-session[170059]: Received disconnect from 49.247.36.49 port 51719:11: Bye Bye [preauth]
Dec 02 23:40:19 compute-0 sshd-session[170059]: Disconnected from invalid user username 49.247.36.49 port 51719 [preauth]
Dec 02 23:40:19 compute-0 systemd[1]: Starting multipathd container...
Dec 02 23:40:19 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:40:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5194ee09d9b97aca0e4609763a0d7a9eabdd8a1068b6005b666638f2fc5ae894/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5194ee09d9b97aca0e4609763a0d7a9eabdd8a1068b6005b666638f2fc5ae894/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:40:19 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.
Dec 02 23:40:19 compute-0 podman[170254]: 2025-12-02 23:40:19.942339131 +0000 UTC m=+0.116619919 container init a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 02 23:40:19 compute-0 multipathd[170269]: + sudo -E kolla_set_configs
Dec 02 23:40:19 compute-0 podman[170254]: 2025-12-02 23:40:19.968142945 +0000 UTC m=+0.142423733 container start a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd)
Dec 02 23:40:19 compute-0 sudo[170275]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 02 23:40:19 compute-0 podman[170254]: multipathd
Dec 02 23:40:19 compute-0 sudo[170275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:19 compute-0 systemd[1]: Started multipathd container.
Dec 02 23:40:20 compute-0 sudo[170210]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:20 compute-0 podman[170276]: 2025-12-02 23:40:20.041688584 +0000 UTC m=+0.061648357 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:40:20 compute-0 multipathd[170269]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:40:20 compute-0 multipathd[170269]: INFO:__main__:Validating config file
Dec 02 23:40:20 compute-0 multipathd[170269]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:40:20 compute-0 multipathd[170269]: INFO:__main__:Writing out command to execute
Dec 02 23:40:20 compute-0 systemd[1]: a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a-20d61801dd460f77.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:40:20 compute-0 systemd[1]: a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a-20d61801dd460f77.service: Failed with result 'exit-code'.
Dec 02 23:40:20 compute-0 sudo[170275]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:20 compute-0 multipathd[170269]: ++ cat /run_command
Dec 02 23:40:20 compute-0 multipathd[170269]: + CMD='/usr/sbin/multipathd -d'
Dec 02 23:40:20 compute-0 multipathd[170269]: + ARGS=
Dec 02 23:40:20 compute-0 multipathd[170269]: + sudo kolla_copy_cacerts
Dec 02 23:40:20 compute-0 sudo[170304]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 02 23:40:20 compute-0 sudo[170304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 02 23:40:20 compute-0 sudo[170304]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:20 compute-0 multipathd[170269]: + [[ ! -n '' ]]
Dec 02 23:40:20 compute-0 multipathd[170269]: + . kolla_extend_start
Dec 02 23:40:20 compute-0 multipathd[170269]: Running command: '/usr/sbin/multipathd -d'
Dec 02 23:40:20 compute-0 multipathd[170269]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 02 23:40:20 compute-0 multipathd[170269]: + umask 0022
Dec 02 23:40:20 compute-0 multipathd[170269]: + exec /usr/sbin/multipathd -d
Dec 02 23:40:20 compute-0 multipathd[170269]: 2910.799351 | multipathd v0.9.9: start up
Dec 02 23:40:20 compute-0 multipathd[170269]: 2910.807144 | reconfigure: setting up paths and maps
Dec 02 23:40:20 compute-0 sudo[170455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smjfgzgtqbieagdcvbhnrxphlfybsmzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718820.5007222-1184-9046253431311/AnsiballZ_file.py'
Dec 02 23:40:20 compute-0 sudo[170455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:20 compute-0 python3.9[170457]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:20 compute-0 sudo[170455]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:22 compute-0 sudo[170607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kinlaokknauprtlpidswkyrqvmjxlfdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718821.6573489-1208-53531981678438/AnsiballZ_file.py'
Dec 02 23:40:22 compute-0 sudo[170607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:22 compute-0 python3.9[170609]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 02 23:40:22 compute-0 sudo[170607]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:22 compute-0 sudo[170759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnhgyyeznnejgrsndhwduqeizhxbkegb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718822.514944-1224-166303843268943/AnsiballZ_modprobe.py'
Dec 02 23:40:22 compute-0 sudo[170759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:23 compute-0 python3.9[170761]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 02 23:40:23 compute-0 kernel: Key type psk registered
Dec 02 23:40:23 compute-0 sudo[170759]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:23 compute-0 sudo[170922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiiikxaetqnfydaqmeyhovjxkgxisvtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718823.5293543-1240-262225996458479/AnsiballZ_stat.py'
Dec 02 23:40:23 compute-0 sudo[170922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:24 compute-0 python3.9[170924]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:40:24 compute-0 sudo[170922]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:24 compute-0 sudo[171045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncppgtjyqtrpwfbkzaxssmnjoifovahv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718823.5293543-1240-262225996458479/AnsiballZ_copy.py'
Dec 02 23:40:24 compute-0 sudo[171045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:24 compute-0 python3.9[171047]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718823.5293543-1240-262225996458479/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:24 compute-0 sudo[171045]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:25 compute-0 sudo[171197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjohhwxtgcjodsqyjamoapnxtjpioavl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718825.1339412-1272-52405720151790/AnsiballZ_lineinfile.py'
Dec 02 23:40:25 compute-0 sudo[171197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:25 compute-0 python3.9[171199]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:25 compute-0 sudo[171197]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:26 compute-0 sudo[171349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgsavajxrvuaauvwxndmaiwxeymlsgvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718826.064408-1288-37458165072471/AnsiballZ_systemd.py'
Dec 02 23:40:26 compute-0 sudo[171349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:26 compute-0 python3.9[171351]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:40:26 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 02 23:40:26 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 02 23:40:26 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 02 23:40:26 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 02 23:40:26 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 02 23:40:26 compute-0 sudo[171349]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:28 compute-0 sudo[171505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vamlohtvzncozqrelrinzberbzqpisqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718828.289677-1304-106662102961961/AnsiballZ_dnf.py'
Dec 02 23:40:28 compute-0 sudo[171505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:28 compute-0 python3.9[171507]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 02 23:40:31 compute-0 systemd[1]: Reloading.
Dec 02 23:40:31 compute-0 systemd-rc-local-generator[171540]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:31 compute-0 systemd-sysv-generator[171545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:31 compute-0 systemd[1]: Reloading.
Dec 02 23:40:31 compute-0 systemd-rc-local-generator[171575]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:31 compute-0 systemd-sysv-generator[171579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:31 compute-0 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 02 23:40:31 compute-0 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 02 23:40:32 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 02 23:40:32 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 02 23:40:32 compute-0 systemd[1]: Reloading.
Dec 02 23:40:32 compute-0 systemd-rc-local-generator[171674]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:32 compute-0 systemd-sysv-generator[171678]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:32 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 02 23:40:32 compute-0 sudo[171505]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 02 23:40:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 02 23:40:33 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.859s CPU time.
Dec 02 23:40:33 compute-0 systemd[1]: run-r526a1103a788409e8efd7b1db9fb128c.service: Deactivated successfully.
Dec 02 23:40:33 compute-0 sudo[172981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqlktcranimddwvactmpywjhnrafdiwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718833.5021677-1320-208678204060388/AnsiballZ_systemd_service.py'
Dec 02 23:40:33 compute-0 sudo[172981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:33 compute-0 podman[172935]: 2025-12-02 23:40:33.916695971 +0000 UTC m=+0.168864933 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:40:34 compute-0 python3.9[172985]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:40:34 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 02 23:40:34 compute-0 iscsid[161327]: iscsid shutting down.
Dec 02 23:40:34 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 02 23:40:34 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 02 23:40:34 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 02 23:40:34 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 02 23:40:34 compute-0 systemd[1]: Started Open-iSCSI.
Dec 02 23:40:34 compute-0 sudo[172981]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:35 compute-0 podman[173118]: 2025-12-02 23:40:35.091274413 +0000 UTC m=+0.107334311 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:40:35 compute-0 python3.9[173153]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:40:36 compute-0 sudo[173317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiwzfcfpehkasjqitaqhlhtiasexpqeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718836.006058-1355-112690624839836/AnsiballZ_file.py'
Dec 02 23:40:36 compute-0 sudo[173317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:36 compute-0 python3.9[173319]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:36 compute-0 sudo[173317]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:37 compute-0 sudo[173469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wicekmidltpyhzzjrlhvvwexasuysupc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718837.1932158-1377-20766091550243/AnsiballZ_systemd_service.py'
Dec 02 23:40:37 compute-0 sudo[173469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:37 compute-0 python3.9[173471]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:40:37 compute-0 systemd[1]: Reloading.
Dec 02 23:40:37 compute-0 systemd-rc-local-generator[173497]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:40:37 compute-0 systemd-sysv-generator[173501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:40:38 compute-0 sudo[173469]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:38 compute-0 python3.9[173655]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:40:39 compute-0 network[173672]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:40:39 compute-0 network[173673]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:40:39 compute-0 network[173674]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:40:44 compute-0 sshd-session[168538]: Received disconnect from 45.78.218.154 port 33282:11: Bye Bye [preauth]
Dec 02 23:40:44 compute-0 sshd-session[168538]: Disconnected from invalid user bitnami 45.78.218.154 port 33282 [preauth]
Dec 02 23:40:48 compute-0 sudo[173946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbtszbfvocsxerubnwptvplkrzzdqchy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718847.6110075-1415-94345417297081/AnsiballZ_systemd_service.py'
Dec 02 23:40:48 compute-0 sudo[173946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:48 compute-0 python3.9[173948]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:48 compute-0 sudo[173946]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:49 compute-0 sudo[174099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmdfwqbazmmpqzbnxfwrvnhntiqeuvfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718848.819005-1415-155444258834114/AnsiballZ_systemd_service.py'
Dec 02 23:40:49 compute-0 sudo[174099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:49 compute-0 python3.9[174101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:49 compute-0 sudo[174099]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:50 compute-0 sudo[174262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgpcxlgoitzjdlcmztthxakootmgvgnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718849.8148422-1415-160734751612254/AnsiballZ_systemd_service.py'
Dec 02 23:40:50 compute-0 sudo[174262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:50 compute-0 podman[174226]: 2025-12-02 23:40:50.198482041 +0000 UTC m=+0.085954745 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:40:50 compute-0 python3.9[174271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:50 compute-0 sudo[174262]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:50 compute-0 sudo[174425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrenxzgyvrpgxqpxsgiyuerrkmhltmcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718850.655729-1415-38498225893495/AnsiballZ_systemd_service.py'
Dec 02 23:40:50 compute-0 sudo[174425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:51 compute-0 python3.9[174427]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:51 compute-0 sudo[174425]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:51 compute-0 sudo[174578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whnmgnqepbxcmbnfnlxhegscnsbzpggp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718851.4527626-1415-225020586714244/AnsiballZ_systemd_service.py'
Dec 02 23:40:51 compute-0 sudo[174578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:52 compute-0 python3.9[174580]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:52 compute-0 sudo[174578]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:52 compute-0 sudo[174731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hypphkcduoretjoabjqqmzjbhysgllxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718852.3184986-1415-222983342702178/AnsiballZ_systemd_service.py'
Dec 02 23:40:52 compute-0 sudo[174731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:52 compute-0 python3.9[174733]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:52 compute-0 sudo[174731]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:53 compute-0 sudo[174884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttamatlgkggnlbyaqjarjwbdozdstbgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718853.1115978-1415-144582675135499/AnsiballZ_systemd_service.py'
Dec 02 23:40:53 compute-0 sudo[174884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:53 compute-0 python3.9[174886]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:53 compute-0 sudo[174884]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:54 compute-0 sudo[175037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caftmckjrkunjegiykvjigggslipgcef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718853.979153-1415-68504830882353/AnsiballZ_systemd_service.py'
Dec 02 23:40:54 compute-0 sudo[175037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:54 compute-0 python3.9[175039]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:40:54 compute-0 sudo[175037]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:56 compute-0 sudo[175190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plowisdcrhdvdhjvnavprvdytofzjobz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718855.772021-1533-213559456483189/AnsiballZ_file.py'
Dec 02 23:40:56 compute-0 sudo[175190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:56 compute-0 python3.9[175192]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:56 compute-0 sudo[175190]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:56 compute-0 sudo[175342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuyoktquywelpiiqqvdffklmpkicnnrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718856.5056357-1533-12915278285054/AnsiballZ_file.py'
Dec 02 23:40:56 compute-0 sudo[175342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:57 compute-0 python3.9[175344]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:57 compute-0 sudo[175342]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:57 compute-0 sudo[175494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nusstedtspnpyzrhiqsvhifdxadsrerk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718857.2652338-1533-211571774114248/AnsiballZ_file.py'
Dec 02 23:40:57 compute-0 sudo[175494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:57 compute-0 python3.9[175496]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:57 compute-0 sudo[175494]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:58 compute-0 sudo[175646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtzkaudakutiboqupgqxccevdxpihzsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718857.9509525-1533-157756950487306/AnsiballZ_file.py'
Dec 02 23:40:58 compute-0 sudo[175646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:58 compute-0 python3.9[175648]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:58 compute-0 sudo[175646]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:58 compute-0 sudo[175798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sejbajsckmgczpsqbibmfgotbgykfdoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718858.6011515-1533-126027725835640/AnsiballZ_file.py'
Dec 02 23:40:58 compute-0 sudo[175798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:40:59 compute-0 python3.9[175800]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:40:59 compute-0 sudo[175798]: pam_unix(sudo:session): session closed for user root
Dec 02 23:40:59 compute-0 sudo[175950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ongwvtpkeksspicjyshpuaejhhltucte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718859.400741-1533-16646926922985/AnsiballZ_file.py'
Dec 02 23:40:59 compute-0 sudo[175950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:00 compute-0 python3.9[175952]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:00 compute-0 sudo[175950]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:41:00.654 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:41:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:41:00.655 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:41:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:41:00.655 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:41:00 compute-0 sudo[176102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stylsnwjpirehaqgemrephwkxpqzdhnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718860.2490704-1533-86843584150511/AnsiballZ_file.py'
Dec 02 23:41:00 compute-0 sudo[176102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:00 compute-0 python3.9[176105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:00 compute-0 sudo[176102]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:01 compute-0 sudo[176255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niermssapfcvwjliizzsswvpojgagsne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718861.0569594-1533-146342590798924/AnsiballZ_file.py'
Dec 02 23:41:01 compute-0 sudo[176255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:01 compute-0 python3.9[176257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:01 compute-0 sudo[176255]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:02 compute-0 sudo[176407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulyzllsjmoqddjtefehlcbemqkxnlodx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718862.1993635-1647-160492756530377/AnsiballZ_file.py'
Dec 02 23:41:02 compute-0 sudo[176407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:02 compute-0 python3.9[176409]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:02 compute-0 sudo[176407]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:03 compute-0 sudo[176559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiqlioxhqhudiizriesksjoytbngajqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718862.8557413-1647-109286263719410/AnsiballZ_file.py'
Dec 02 23:41:03 compute-0 sudo[176559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:03 compute-0 python3.9[176561]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:03 compute-0 sudo[176559]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:03 compute-0 sudo[176711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoqvgdpuzkaeidkdazlthnsklzxtgxog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718863.6214411-1647-201682666927605/AnsiballZ_file.py'
Dec 02 23:41:03 compute-0 sudo[176711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:04 compute-0 podman[176713]: 2025-12-02 23:41:04.137910373 +0000 UTC m=+0.132180912 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=ovn_controller)
Dec 02 23:41:04 compute-0 python3.9[176714]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:04 compute-0 sudo[176711]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:04 compute-0 sudo[176890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngypjsofpebliykcmofjmxvvoiwrryxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718864.3509243-1647-36264870248825/AnsiballZ_file.py'
Dec 02 23:41:04 compute-0 sudo[176890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:04 compute-0 python3.9[176892]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:04 compute-0 sudo[176890]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:05 compute-0 sudo[177050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkrpetpertgeebzpyrwdnbmmlwjercsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718864.9791162-1647-18831057105102/AnsiballZ_file.py'
Dec 02 23:41:05 compute-0 sudo[177050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:05 compute-0 podman[177016]: 2025-12-02 23:41:05.328013406 +0000 UTC m=+0.072588553 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 23:41:05 compute-0 python3.9[177055]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:05 compute-0 sudo[177050]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:06 compute-0 sudo[177213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfqdrutsavpifmcbfqbyvcxazdgdfyms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718865.6681857-1647-16613043794697/AnsiballZ_file.py'
Dec 02 23:41:06 compute-0 sudo[177213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:06 compute-0 python3.9[177215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:06 compute-0 sudo[177213]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:07 compute-0 sudo[177365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-favkgaujtehxaqsuwccvbwccddhvmcab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718866.6226883-1647-155258750497098/AnsiballZ_file.py'
Dec 02 23:41:07 compute-0 sudo[177365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:07 compute-0 python3.9[177367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:07 compute-0 sudo[177365]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:07 compute-0 sudo[177517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilkugstvkjmslhjjlygfmkmwjokwscor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718867.4148688-1647-157463131737861/AnsiballZ_file.py'
Dec 02 23:41:07 compute-0 sudo[177517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:07 compute-0 python3.9[177519]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:07 compute-0 sudo[177517]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:09 compute-0 sudo[177669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lixpbgnntftxhzjeuzfkoohoifcrffsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718868.8855624-1763-76232232181015/AnsiballZ_command.py'
Dec 02 23:41:09 compute-0 sudo[177669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:09 compute-0 python3.9[177671]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:09 compute-0 sudo[177669]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:10 compute-0 python3.9[177823]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:41:11 compute-0 sudo[177973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-golwhrfojbrlfsbvolqnwlrzulqzbgos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718870.9351203-1799-42090205655060/AnsiballZ_systemd_service.py'
Dec 02 23:41:11 compute-0 sudo[177973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:11 compute-0 python3.9[177975]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:41:11 compute-0 systemd[1]: Reloading.
Dec 02 23:41:11 compute-0 systemd-rc-local-generator[177997]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:41:11 compute-0 systemd-sysv-generator[178001]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:41:11 compute-0 sudo[177973]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:12 compute-0 sudo[178161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faklenuwgbambquwlnjmvacpqvhfgtsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718872.3303938-1815-251223054309729/AnsiballZ_command.py'
Dec 02 23:41:12 compute-0 sudo[178161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:12 compute-0 python3.9[178163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:12 compute-0 sudo[178161]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:13 compute-0 sudo[178314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrzpxfoqmzdiuunxiftkdbynbqvwrcfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718873.1305556-1815-174402031588250/AnsiballZ_command.py'
Dec 02 23:41:13 compute-0 sudo[178314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:13 compute-0 python3.9[178316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:13 compute-0 sudo[178314]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:14 compute-0 sudo[178467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohhtzznpaumtrmyrhsfqgikaqduvpkek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718873.8647377-1815-252851118580949/AnsiballZ_command.py'
Dec 02 23:41:14 compute-0 sudo[178467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:14 compute-0 python3.9[178469]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:14 compute-0 sudo[178467]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:14 compute-0 sudo[178620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smyunqcfyuqnkzzlhcpaovylqfddwqyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718874.622116-1815-153806656106763/AnsiballZ_command.py'
Dec 02 23:41:14 compute-0 sudo[178620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:15 compute-0 python3.9[178622]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:15 compute-0 sudo[178620]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:15 compute-0 sudo[178773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cggihmwahajcgsomydighhcfayjmchig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718875.3915625-1815-278504388401925/AnsiballZ_command.py'
Dec 02 23:41:15 compute-0 sudo[178773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:15 compute-0 python3.9[178775]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:15 compute-0 sudo[178773]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:16 compute-0 sudo[178926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjhyxzwwpxlykopgvojsxyxbggvkyaqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718876.2085516-1815-154814133217694/AnsiballZ_command.py'
Dec 02 23:41:16 compute-0 sudo[178926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:16 compute-0 python3.9[178928]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:16 compute-0 sudo[178926]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:17 compute-0 sudo[179079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umlqynqmizjufmnetwuxwfohnvkyteli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718876.9677153-1815-78584212599822/AnsiballZ_command.py'
Dec 02 23:41:17 compute-0 sudo[179079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:17 compute-0 python3.9[179081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:17 compute-0 sudo[179079]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:18 compute-0 sudo[179232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vskdbdsrmhpuzsydwhxdzyvpwyqtgzxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718877.848057-1815-259471440308354/AnsiballZ_command.py'
Dec 02 23:41:18 compute-0 sudo[179232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:18 compute-0 python3.9[179234]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:41:18 compute-0 sudo[179232]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:20 compute-0 sudo[179398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lchmkbqkplknzffpyssbyxzfhpkpikfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718879.9917235-1958-92281594205341/AnsiballZ_file.py'
Dec 02 23:41:20 compute-0 sudo[179398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:20 compute-0 podman[179359]: 2025-12-02 23:41:20.41690871 +0000 UTC m=+0.077431670 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:41:20 compute-0 python3.9[179407]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:20 compute-0 sudo[179398]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:21 compute-0 sudo[179558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btkdxvjtirvfzrxcsvrvmpwcqixqupkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718880.8038535-1958-126504178668531/AnsiballZ_file.py'
Dec 02 23:41:21 compute-0 sudo[179558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:21 compute-0 python3.9[179560]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:21 compute-0 sudo[179558]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:22 compute-0 sudo[179710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyumkcpcipgaaptrvaenrmmtmikskgfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718881.5353827-1958-120196005114077/AnsiballZ_file.py'
Dec 02 23:41:22 compute-0 sudo[179710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:22 compute-0 python3.9[179712]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:22 compute-0 sudo[179710]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:23 compute-0 sudo[179862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoqvvcawvovghwzvscxmlcmvyainiexb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718882.692418-2002-249758303629742/AnsiballZ_file.py'
Dec 02 23:41:23 compute-0 sudo[179862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:23 compute-0 python3.9[179864]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:23 compute-0 sudo[179862]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:23 compute-0 sudo[180014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zogfjebwknhjcobiweqtbewtrdlfdnxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718883.4161184-2002-140011901825585/AnsiballZ_file.py'
Dec 02 23:41:23 compute-0 sudo[180014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:23 compute-0 python3.9[180016]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:23 compute-0 sudo[180014]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:24 compute-0 sudo[180166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuvmskgozgqgofvsqfbnewbriniuhvdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718884.0684826-2002-74858041164722/AnsiballZ_file.py'
Dec 02 23:41:24 compute-0 sudo[180166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:24 compute-0 python3.9[180168]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:24 compute-0 sudo[180166]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:25 compute-0 sudo[180318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sefatykqcgqgigmuvvtuxaknjvjginhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718884.7495735-2002-201001185688031/AnsiballZ_file.py'
Dec 02 23:41:25 compute-0 sudo[180318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:25 compute-0 python3.9[180320]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:25 compute-0 sudo[180318]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:25 compute-0 sudo[180470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pifjrnjaeclpbbikverpbxcyyjkhmkun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718885.4080255-2002-111301077253284/AnsiballZ_file.py'
Dec 02 23:41:25 compute-0 sudo[180470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:25 compute-0 python3.9[180472]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:25 compute-0 sudo[180470]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:26 compute-0 sudo[180622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrbqshrmdofcsdvdhyitxjstxkjeoayl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718886.1632323-2002-177465401745961/AnsiballZ_file.py'
Dec 02 23:41:26 compute-0 sudo[180622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:26 compute-0 python3.9[180624]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:26 compute-0 sudo[180622]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:27 compute-0 sudo[180774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjmgjtuekqkbpmospsyiumagyqiwshxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718886.8656108-2002-226613980527505/AnsiballZ_file.py'
Dec 02 23:41:27 compute-0 sudo[180774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:27 compute-0 python3.9[180776]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:27 compute-0 sudo[180774]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:33 compute-0 sudo[180926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqnzbdneihqxulyhosrmndtcnmznxuew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718893.0388508-2239-201586818819433/AnsiballZ_getent.py'
Dec 02 23:41:33 compute-0 sudo[180926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:33 compute-0 python3.9[180928]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 02 23:41:33 compute-0 sudo[180926]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:34 compute-0 sudo[181094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msrquarrvxcbkdnqemmpstrsydakwyah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718893.9228904-2255-73912195947034/AnsiballZ_group.py'
Dec 02 23:41:34 compute-0 sudo[181094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:34 compute-0 podman[181053]: 2025-12-02 23:41:34.477394031 +0000 UTC m=+0.100798965 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 02 23:41:34 compute-0 python3.9[181098]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:41:34 compute-0 groupadd[181109]: group added to /etc/group: name=nova, GID=42436
Dec 02 23:41:34 compute-0 groupadd[181109]: group added to /etc/gshadow: name=nova
Dec 02 23:41:34 compute-0 groupadd[181109]: new group: name=nova, GID=42436
Dec 02 23:41:34 compute-0 sudo[181094]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:35 compute-0 podman[181238]: 2025-12-02 23:41:35.676016448 +0000 UTC m=+0.056647613 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:41:35 compute-0 sudo[181283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtqeflqbqivrtjfusivmvnizwmdpwidp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718895.0498145-2271-242174308163000/AnsiballZ_user.py'
Dec 02 23:41:35 compute-0 sudo[181283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:35 compute-0 python3.9[181285]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:41:35 compute-0 useradd[181287]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 02 23:41:35 compute-0 useradd[181287]: add 'nova' to group 'libvirt'
Dec 02 23:41:35 compute-0 useradd[181287]: add 'nova' to shadow group 'libvirt'
Dec 02 23:41:36 compute-0 sudo[181283]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:37 compute-0 sshd-session[181318]: Accepted publickey for zuul from 192.168.122.30 port 38452 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:41:37 compute-0 systemd-logind[795]: New session 25 of user zuul.
Dec 02 23:41:37 compute-0 systemd[1]: Started Session 25 of User zuul.
Dec 02 23:41:37 compute-0 sshd-session[181318]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:41:37 compute-0 sshd-session[181321]: Received disconnect from 192.168.122.30 port 38452:11: disconnected by user
Dec 02 23:41:37 compute-0 sshd-session[181321]: Disconnected from user zuul 192.168.122.30 port 38452
Dec 02 23:41:37 compute-0 sshd-session[181318]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:41:37 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Dec 02 23:41:37 compute-0 systemd-logind[795]: Session 25 logged out. Waiting for processes to exit.
Dec 02 23:41:37 compute-0 systemd-logind[795]: Removed session 25.
Dec 02 23:41:38 compute-0 python3.9[181471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:38 compute-0 python3.9[181592]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718897.491349-2321-185000621266114/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:39 compute-0 python3.9[181742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:39 compute-0 python3.9[181818]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:40 compute-0 python3.9[181968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:40 compute-0 python3.9[182089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718899.8727634-2321-257176244984510/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:41 compute-0 python3.9[182239]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:42 compute-0 python3.9[182360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718901.1720126-2321-146360548988640/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:43 compute-0 python3.9[182510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:43 compute-0 python3.9[182631]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718902.5958943-2321-151221497690591/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:44 compute-0 python3.9[182781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:45 compute-0 python3.9[182902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718903.8825545-2321-209300888922266/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:47 compute-0 sudo[183052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygopzaunrfidpnazolearnkbnztpyoir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718906.7104745-2487-175507540686839/AnsiballZ_file.py'
Dec 02 23:41:47 compute-0 sudo[183052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:47 compute-0 python3.9[183054]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:47 compute-0 sudo[183052]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:47 compute-0 sudo[183204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-misndepbkwgoqghdmacvbwysslvffeoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718907.6164582-2503-154178746098296/AnsiballZ_copy.py'
Dec 02 23:41:47 compute-0 sudo[183204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:48 compute-0 python3.9[183206]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:41:48 compute-0 sudo[183204]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:48 compute-0 sudo[183356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hecvlrackjaxdvykjgbjrcacmiaqeggk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718908.530255-2519-36738000481619/AnsiballZ_stat.py'
Dec 02 23:41:48 compute-0 sudo[183356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:49 compute-0 python3.9[183358]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:41:49 compute-0 sudo[183356]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:49 compute-0 sudo[183508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paogcvdrrmczycxidadmdhbwchdcjfhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718909.5430374-2535-127044926566402/AnsiballZ_stat.py'
Dec 02 23:41:49 compute-0 sudo[183508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:50 compute-0 python3.9[183510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:50 compute-0 sudo[183508]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:50 compute-0 sudo[183647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sijhcpagllubgnguqafawuozbdglrtax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718909.5430374-2535-127044926566402/AnsiballZ_copy.py'
Dec 02 23:41:50 compute-0 sudo[183647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:50 compute-0 podman[183605]: 2025-12-02 23:41:50.523730981 +0000 UTC m=+0.057008672 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:41:50 compute-0 python3.9[183653]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764718909.5430374-2535-127044926566402/.source _original_basename=.i_o_5zlo follow=False checksum=db76a38606333bc7e27a07e76be452f91b2c5d1e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 02 23:41:50 compute-0 sudo[183647]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:51 compute-0 python3.9[183805]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:41:52 compute-0 python3.9[183957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:53 compute-0 python3.9[184078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718912.152754-2587-186377788898515/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=be63b1bdae8b60cf07c8ce2aab749fcc5ff45b00 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:54 compute-0 python3.9[184228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:41:54 compute-0 python3.9[184349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718913.615007-2617-266307869668733/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=b86e9600018c7097ad57dbba089fc76217333398 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:41:55 compute-0 sudo[184501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlxmrcxtjoahlbpbhplfqbwxvdxtkekw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718915.4692907-2651-171575530561705/AnsiballZ_container_config_data.py'
Dec 02 23:41:55 compute-0 sudo[184501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:56 compute-0 python3.9[184503]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 02 23:41:56 compute-0 sudo[184501]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:56 compute-0 sshd-session[184374]: Received disconnect from 49.247.36.49 port 40032:11: Bye Bye [preauth]
Dec 02 23:41:56 compute-0 sshd-session[184374]: Disconnected from authenticating user root 49.247.36.49 port 40032 [preauth]
Dec 02 23:41:56 compute-0 sudo[184653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvztrpumlzydmlmeksbninenziccbshn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718916.521703-2669-46363659086249/AnsiballZ_container_config_hash.py'
Dec 02 23:41:56 compute-0 sudo[184653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:57 compute-0 python3.9[184655]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:41:57 compute-0 sudo[184653]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:57 compute-0 sudo[184805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anwfrwcagorpbriyeaddwtvxsbqewygf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718917.618449-2689-17596306061190/AnsiballZ_edpm_container_manage.py'
Dec 02 23:41:57 compute-0 sudo[184805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:58 compute-0 python3[184807]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:41:58 compute-0 podman[184844]: 2025-12-02 23:41:58.460572488 +0000 UTC m=+0.058201033 container create 9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:41:58 compute-0 podman[184844]: 2025-12-02 23:41:58.431735869 +0000 UTC m=+0.029364454 image pull 99c98706e6d475ab9a9b50baf3431e8745aac38f98f776ef6ab7d3c7a2811699 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Dec 02 23:41:58 compute-0 python3[184807]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 02 23:41:58 compute-0 sudo[184805]: pam_unix(sudo:session): session closed for user root
Dec 02 23:41:59 compute-0 sudo[185030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyyxqahxuxxgzeesuwfeomrbglasqtrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718919.0821092-2705-151425705969038/AnsiballZ_stat.py'
Dec 02 23:41:59 compute-0 sudo[185030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:41:59 compute-0 python3.9[185032]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:41:59 compute-0 sudo[185030]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:00 compute-0 sudo[185184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiexxyqhqxuktcxmymonfefgqprhefmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718920.3408482-2729-102420416502240/AnsiballZ_container_config_data.py'
Dec 02 23:42:00 compute-0 sudo[185184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:42:00.656 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:42:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:42:00.657 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:42:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:42:00.657 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:42:00 compute-0 python3.9[185186]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 02 23:42:00 compute-0 sudo[185184]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:01 compute-0 sudo[185337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xztwdezfijrldszzehqcgtayccofzisy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718921.3380566-2747-3540152767134/AnsiballZ_container_config_hash.py'
Dec 02 23:42:01 compute-0 sudo[185337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:01 compute-0 python3.9[185339]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:42:01 compute-0 sudo[185337]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:02 compute-0 sudo[185489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mziiaybwtfvmepaytvjaxvsyoufjndin ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764718922.415287-2767-108194431378445/AnsiballZ_edpm_container_manage.py'
Dec 02 23:42:02 compute-0 sudo[185489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:02 compute-0 python3[185491]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:42:03 compute-0 podman[185527]: 2025-12-02 23:42:03.164794559 +0000 UTC m=+0.046094635 container create e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, io.buildah.version=1.41.4)
Dec 02 23:42:03 compute-0 podman[185527]: 2025-12-02 23:42:03.140586251 +0000 UTC m=+0.021886327 image pull 99c98706e6d475ab9a9b50baf3431e8745aac38f98f776ef6ab7d3c7a2811699 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Dec 02 23:42:03 compute-0 python3[185491]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Dec 02 23:42:03 compute-0 sudo[185489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:04 compute-0 sudo[185715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgxdblvcyjgworvtrpezdbwqhlepnwyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718923.7324498-2783-48831833690958/AnsiballZ_stat.py'
Dec 02 23:42:04 compute-0 sudo[185715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:04 compute-0 python3.9[185717]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:04 compute-0 sudo[185715]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:05 compute-0 sudo[185880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbfopuiwwoximoctrzkvivwvnixywtod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718924.722757-2801-175381413971788/AnsiballZ_file.py'
Dec 02 23:42:05 compute-0 sudo[185880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:05 compute-0 podman[185843]: 2025-12-02 23:42:05.136123897 +0000 UTC m=+0.111045496 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 02 23:42:05 compute-0 python3.9[185886]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:05 compute-0 sudo[185880]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:05 compute-0 sudo[186056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhtvzlglypahakejgfwjaqwzdftjfxwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718925.4216418-2801-52465129042649/AnsiballZ_copy.py'
Dec 02 23:42:05 compute-0 sudo[186056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:05 compute-0 podman[186018]: 2025-12-02 23:42:05.944474856 +0000 UTC m=+0.075298994 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 02 23:42:06 compute-0 python3.9[186062]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764718925.4216418-2801-52465129042649/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:06 compute-0 sudo[186056]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:06 compute-0 sudo[186136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjlnszpoteghwmjhircjrgmrtlgwlcvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718925.4216418-2801-52465129042649/AnsiballZ_systemd.py'
Dec 02 23:42:06 compute-0 sudo[186136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:06 compute-0 python3.9[186138]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:42:06 compute-0 systemd[1]: Reloading.
Dec 02 23:42:06 compute-0 systemd-rc-local-generator[186163]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:06 compute-0 systemd-sysv-generator[186168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:07 compute-0 sudo[186136]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:07 compute-0 sudo[186247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkjzjtvqntkcqnmgixmxhaazbccasyxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718925.4216418-2801-52465129042649/AnsiballZ_systemd.py'
Dec 02 23:42:07 compute-0 sudo[186247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:07 compute-0 python3.9[186249]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:42:07 compute-0 systemd[1]: Reloading.
Dec 02 23:42:07 compute-0 systemd-rc-local-generator[186278]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:07 compute-0 systemd-sysv-generator[186282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:08 compute-0 systemd[1]: Starting nova_compute container...
Dec 02 23:42:08 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:42:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:08 compute-0 podman[186288]: 2025-12-02 23:42:08.296335657 +0000 UTC m=+0.140601831 container init e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:42:08 compute-0 podman[186288]: 2025-12-02 23:42:08.314305258 +0000 UTC m=+0.158571432 container start e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Dec 02 23:42:08 compute-0 podman[186288]: nova_compute
Dec 02 23:42:08 compute-0 nova_compute[186303]: + sudo -E kolla_set_configs
Dec 02 23:42:08 compute-0 systemd[1]: Started nova_compute container.
Dec 02 23:42:08 compute-0 sudo[186247]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Validating config file
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying service configuration files
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Deleting /etc/ceph
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Creating directory /etc/ceph
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Writing out command to execute
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:08 compute-0 nova_compute[186303]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:08 compute-0 nova_compute[186303]: ++ cat /run_command
Dec 02 23:42:08 compute-0 nova_compute[186303]: + CMD=nova-compute
Dec 02 23:42:08 compute-0 nova_compute[186303]: + ARGS=
Dec 02 23:42:08 compute-0 nova_compute[186303]: + sudo kolla_copy_cacerts
Dec 02 23:42:08 compute-0 nova_compute[186303]: + [[ ! -n '' ]]
Dec 02 23:42:08 compute-0 nova_compute[186303]: + . kolla_extend_start
Dec 02 23:42:08 compute-0 nova_compute[186303]: Running command: 'nova-compute'
Dec 02 23:42:08 compute-0 nova_compute[186303]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 23:42:08 compute-0 nova_compute[186303]: + umask 0022
Dec 02 23:42:08 compute-0 nova_compute[186303]: + exec nova-compute
Dec 02 23:42:09 compute-0 python3.9[186464]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.452 186307 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.452 186307 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.452 186307 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.453 186307 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.610 186307 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.642 186307 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.643 186307 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.681 186307 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Dec 02 23:42:10 compute-0 nova_compute[186303]: 2025-12-02 23:42:10.683 186307 WARNING oslo_config.cfg [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Dec 02 23:42:10 compute-0 python3.9[186617]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:11 compute-0 nova_compute[186303]: 2025-12-02 23:42:11.898 186307 INFO nova.virt.driver [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 23:42:11 compute-0 nova_compute[186303]: 2025-12-02 23:42:11.981 186307 INFO nova.compute.provider_config [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 23:42:11 compute-0 python3.9[186767]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.488 186307 DEBUG oslo_concurrency.lockutils [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.489 186307 DEBUG oslo_concurrency.lockutils [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.489 186307 DEBUG oslo_concurrency.lockutils [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.490 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.490 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.490 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.491 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.491 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.492 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.492 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.492 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.492 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.493 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.493 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.493 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.493 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.494 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.494 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.494 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.495 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.495 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.495 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.495 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.496 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.496 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.496 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.496 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.497 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.497 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.497 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.497 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.498 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.498 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.498 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.498 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.499 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.499 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.499 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.499 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.500 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.500 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.500 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.500 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.501 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.501 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.501 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.502 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.502 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.502 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.503 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.503 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.503 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.503 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.504 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.504 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.504 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.504 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.505 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.505 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.505 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.506 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.506 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.506 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.506 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.507 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.507 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.507 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.508 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.508 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.508 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.508 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.508 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.509 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.509 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.509 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.509 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.510 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.510 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.510 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.510 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.511 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.511 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.511 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.511 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.512 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.512 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.512 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.513 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.513 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.513 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.513 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.514 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.514 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.514 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.514 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.515 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.515 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.515 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.515 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.516 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.516 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.516 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.516 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.516 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.517 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.517 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.517 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.518 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.518 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.518 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.518 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.519 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.519 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.519 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.519 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.520 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.520 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.520 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.520 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.521 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.521 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.521 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.521 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.522 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.522 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.522 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.523 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.523 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.523 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.523 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.523 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.524 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.524 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.524 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.525 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.525 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.525 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.525 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.526 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.526 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.526 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.526 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.527 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.527 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.527 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.527 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.528 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.528 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.528 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.529 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.529 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.529 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.529 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.530 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.530 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.530 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.531 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.531 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.531 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.531 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.532 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.532 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.532 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.532 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.533 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.533 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.533 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.534 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.534 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.534 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.534 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.535 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.535 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.535 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.535 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.536 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.536 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.536 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.536 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.537 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.537 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.537 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.538 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.538 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.538 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.538 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.539 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.539 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.539 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.539 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.539 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.539 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.540 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.540 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.540 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.540 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.540 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.540 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.541 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.541 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.541 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.541 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.541 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.541 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.542 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.542 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.542 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.542 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.542 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.543 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.543 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.543 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.543 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.543 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.543 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.544 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.544 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.544 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.544 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.544 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.544 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.545 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.545 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.545 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.545 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.545 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.546 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.546 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.546 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.546 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.546 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.546 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.547 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.547 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.547 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.547 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.547 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.547 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.548 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.548 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.548 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.548 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.548 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.548 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.549 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.549 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.549 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.549 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.549 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.550 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.550 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.550 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.550 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.550 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.550 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.551 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.551 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.551 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.551 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.551 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.551 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.552 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.552 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.552 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.552 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.552 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.553 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.553 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.553 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.553 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.553 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.553 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.554 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.554 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.554 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.554 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.554 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.554 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.555 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.555 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.555 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.555 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.555 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.556 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.556 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.556 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.556 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.556 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.556 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.557 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.557 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.557 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.557 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.557 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.557 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.558 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.558 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.558 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.558 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.558 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.558 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.559 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.559 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.559 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.559 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.559 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.559 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.560 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.560 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.560 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.560 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.560 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.560 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.561 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.561 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.561 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.561 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.561 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.562 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.562 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.562 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.562 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.562 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.563 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.563 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.563 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.563 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.563 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.563 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.564 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.564 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.564 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.564 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.564 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.566 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.566 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.566 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.566 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.567 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.567 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.567 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.567 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.567 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.567 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.568 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.568 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.568 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.568 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.568 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.568 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.569 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.569 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.569 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.569 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.569 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.570 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.570 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.570 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.570 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.570 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.570 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.571 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.572 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.572 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.572 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.572 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.572 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.572 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.573 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.573 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.573 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.573 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.573 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.574 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.575 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.575 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.575 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.576 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.576 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.576 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.576 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.576 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.576 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.577 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.577 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.577 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.577 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.577 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.577 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.578 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.578 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.578 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.578 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.578 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.578 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.579 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.580 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.581 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.582 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.583 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.584 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.585 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.586 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.587 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.588 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.589 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.590 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.591 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.592 186307 WARNING oslo_config.cfg [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 23:42:12 compute-0 nova_compute[186303]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 23:42:12 compute-0 nova_compute[186303]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 23:42:12 compute-0 nova_compute[186303]: and ``live_migration_inbound_addr`` respectively.
Dec 02 23:42:12 compute-0 nova_compute[186303]: ).  Its value may be silently ignored in the future.
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.593 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.593 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.593 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.593 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.593 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.593 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.594 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.595 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.596 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.597 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.598 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.599 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.600 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.601 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.602 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.603 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.604 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.605 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.606 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.607 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.608 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.609 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.610 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.611 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.612 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.613 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.614 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.615 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.616 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.617 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.618 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.619 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.620 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.621 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.622 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.623 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.624 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.625 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.626 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.627 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.628 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.629 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.630 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.631 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.632 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.633 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.634 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.635 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.636 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.637 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.638 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.639 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.640 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.641 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.642 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.643 186307 DEBUG oslo_service.backend._eventlet.service [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 02 23:42:12 compute-0 nova_compute[186303]: 2025-12-02 23:42:12.645 186307 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Dec 02 23:42:12 compute-0 sudo[186919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmgnkoueniguwrnuuiedqmnavzsevutd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718932.3578467-2921-227109464849468/AnsiballZ_podman_container.py'
Dec 02 23:42:12 compute-0 sudo[186919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:13 compute-0 python3.9[186921]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.155 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Dec 02 23:42:13 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:13 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:13 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 02 23:42:13 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.248 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4a60f80fe0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Dec 02 23:42:13 compute-0 nova_compute[186303]: libvirt:  error : internal error: could not initialize domain event timer
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.249 186307 WARNING nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.250 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4a60f80fe0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.251 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.253 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.253 186307 INFO nova.utils [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] The default thread pool MainProcess.default is initialized
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.254 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.254 186307 INFO nova.virt.libvirt.driver [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Connection event '1' reason 'None'
Dec 02 23:42:13 compute-0 sudo[186919]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.760 186307 WARNING nova.virt.libvirt.driver [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 02 23:42:13 compute-0 nova_compute[186303]: 2025-12-02 23:42:13.761 186307 DEBUG nova.virt.libvirt.volume.mount [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 23:42:13 compute-0 sudo[187154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjqhhimjyrcsboblwvmevpsxipofyequ ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718933.6179743-2937-56107963364318/AnsiballZ_systemd.py'
Dec 02 23:42:13 compute-0 sudo[187154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.118 186307 INFO nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]: 
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <host>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <uuid>b18b06c3-ce10-4c45-b42d-5391c38cc9dc</uuid>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <arch>x86_64</arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <microcode version='16777317'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <signature family='23' model='49' stepping='0'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='x2apic'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='tsc-deadline'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='osxsave'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='hypervisor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='tsc_adjust'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='spec-ctrl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='stibp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='arch-capabilities'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='cmp_legacy'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='topoext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='virt-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='lbrv'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='tsc-scale'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='vmcb-clean'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='pause-filter'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='pfthreshold'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='svme-addr-chk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='rdctl-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='mds-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature name='pschange-mc-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <pages unit='KiB' size='4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <pages unit='KiB' size='2048'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <pages unit='KiB' size='1048576'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <power_management>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <suspend_mem/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <suspend_disk/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <suspend_hybrid/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </power_management>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <iommu support='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <migration_features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <live/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <uri_transports>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <uri_transport>tcp</uri_transport>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <uri_transport>rdma</uri_transport>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </uri_transports>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </migration_features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <topology>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <cells num='1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <cell id='0'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           <memory unit='KiB'>7864316</memory>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           <distances>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <sibling id='0' value='10'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           </distances>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           <cpus num='8'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:           </cpus>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         </cell>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </cells>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </topology>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <cache>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </cache>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <secmodel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model>selinux</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <doi>0</doi>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </secmodel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <secmodel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model>dac</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <doi>0</doi>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </secmodel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </host>
Dec 02 23:42:14 compute-0 nova_compute[186303]: 
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <guest>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <os_type>hvm</os_type>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <arch name='i686'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <wordsize>32</wordsize>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <domain type='qemu'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <domain type='kvm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <pae/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <nonpae/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <apic default='on' toggle='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <cpuselection/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <deviceboot/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <externalSnapshot/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </guest>
Dec 02 23:42:14 compute-0 nova_compute[186303]: 
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <guest>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <os_type>hvm</os_type>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <arch name='x86_64'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <wordsize>64</wordsize>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <domain type='qemu'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <domain type='kvm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <apic default='on' toggle='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <cpuselection/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <deviceboot/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <externalSnapshot/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </guest>
Dec 02 23:42:14 compute-0 nova_compute[186303]: 
Dec 02 23:42:14 compute-0 nova_compute[186303]: </capabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]: 
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.127 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.156 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 23:42:14 compute-0 nova_compute[186303]: <domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <arch>i686</arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <vcpu max='240'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <os supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='firmware'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <loader supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>rom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pflash</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='readonly'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>yes</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='secure'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </loader>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </os>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>anonymous</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>memfd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </memoryBacking>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <disk supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>disk</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cdrom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>floppy</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>lun</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ide</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>fdc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>sata</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </disk>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vnc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </graphics>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <video supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='modelType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vga</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cirrus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>none</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>bochs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ramfb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </video>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='mode'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>subsystem</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>mandatory</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>requisite</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>optional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pci</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hostdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <rng supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>random</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </rng>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='driverType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>path</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>handle</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </filesystem>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emulator</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>external</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>2.0</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </tpm>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </redirdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <channel supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </channel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </crypto>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <interface supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>passt</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </interface>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <panic supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>isa</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>hyperv</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </panic>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <console supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>null</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dev</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pipe</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stdio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>udp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tcp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </console>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <gic supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sev supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='features'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>relaxed</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vapic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vpindex</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>runtime</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>synic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stimer</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reset</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>frequencies</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ipi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>avic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hyperv>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='sectype'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tdx</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </launchSecurity>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </features>
Dec 02 23:42:14 compute-0 nova_compute[186303]: </domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.165 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 23:42:14 compute-0 nova_compute[186303]: <domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <arch>i686</arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <vcpu max='4096'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <os supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='firmware'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <loader supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>rom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pflash</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='readonly'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>yes</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='secure'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </loader>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </os>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 python3.9[187156]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>anonymous</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>memfd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </memoryBacking>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <disk supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>disk</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cdrom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>floppy</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>lun</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>fdc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>sata</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </disk>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vnc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </graphics>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <video supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='modelType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vga</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cirrus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>none</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>bochs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ramfb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </video>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='mode'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>subsystem</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>mandatory</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>requisite</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>optional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pci</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hostdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <rng supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>random</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </rng>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='driverType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>path</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>handle</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </filesystem>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emulator</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>external</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>2.0</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </tpm>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </redirdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <channel supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </channel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </crypto>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <interface supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>passt</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </interface>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <panic supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>isa</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>hyperv</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </panic>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <console supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>null</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dev</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pipe</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stdio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>udp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tcp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </console>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <gic supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sev supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='features'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>relaxed</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vapic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vpindex</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>runtime</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>synic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stimer</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reset</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>frequencies</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ipi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>avic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hyperv>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='sectype'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tdx</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </launchSecurity>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </features>
Dec 02 23:42:14 compute-0 nova_compute[186303]: </domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.206 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.209 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 23:42:14 compute-0 nova_compute[186303]: <domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <arch>x86_64</arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <vcpu max='240'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <os supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='firmware'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <loader supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>rom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pflash</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='readonly'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>yes</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='secure'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </loader>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </os>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 systemd[1]: Stopping nova_compute container...
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>anonymous</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>memfd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </memoryBacking>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <disk supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>disk</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cdrom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>floppy</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>lun</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ide</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>fdc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>sata</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </disk>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vnc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </graphics>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <video supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='modelType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vga</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cirrus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>none</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>bochs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ramfb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </video>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='mode'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>subsystem</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>mandatory</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>requisite</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>optional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pci</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hostdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <rng supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>random</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </rng>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='driverType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>path</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>handle</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </filesystem>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emulator</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>external</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>2.0</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </tpm>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </redirdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <channel supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </channel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </crypto>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <interface supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>passt</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </interface>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <panic supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>isa</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>hyperv</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </panic>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <console supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>null</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dev</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pipe</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stdio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>udp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tcp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </console>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <gic supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sev supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='features'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>relaxed</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vapic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vpindex</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>runtime</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>synic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stimer</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reset</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>frequencies</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ipi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>avic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hyperv>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='sectype'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tdx</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </launchSecurity>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </features>
Dec 02 23:42:14 compute-0 nova_compute[186303]: </domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.282 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 23:42:14 compute-0 nova_compute[186303]: <domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <domain>kvm</domain>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <arch>x86_64</arch>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <vcpu max='4096'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <iothreads supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <os supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='firmware'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>efi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <loader supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>rom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pflash</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='readonly'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>yes</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='secure'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>yes</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>no</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </loader>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </os>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='maximumMigratable'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>on</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>off</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <vendor>AMD</vendor>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='succor'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <mode name='custom' supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Denverton-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='auto-ibrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amd-psfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='stibp-always-on'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='EPYC-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-128'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-256'/>
Dec 02 23:42:14 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx10-512'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='prefetchiti'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Haswell-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512er'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512pf'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fma4'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tbm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xop'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='amx-tile'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-bf16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-fp16'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bitalg'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrc'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fzrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='la57'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='taa-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xfd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ifma'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cmpccxadd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fbsdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='fsrs'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ibrs-all'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mcdt-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pbrsb-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='psdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='serialize'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vaes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='hle'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='rtm'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512bw'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512cd'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512dq'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512f'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='avx512vl'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='invpcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pcid'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='pku'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='mpx'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='core-capability'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='split-lock-detect'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='cldemote'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='erms'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='gfni'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdir64b'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='movdiri'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='xsaves'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='athlon-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='core2duo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='coreduo-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='n270-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='ss'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <blockers model='phenom-v1'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnow'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <feature name='3dnowext'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </blockers>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </mode>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </cpu>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <memoryBacking supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <enum name='sourceType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>anonymous</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <value>memfd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </memoryBacking>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <disk supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='diskDevice'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>disk</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cdrom</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>floppy</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>lun</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>fdc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>sata</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </disk>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <graphics supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vnc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egl-headless</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </graphics>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <video supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='modelType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vga</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>cirrus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>none</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>bochs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ramfb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </video>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hostdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='mode'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>subsystem</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='startupPolicy'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>mandatory</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>requisite</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>optional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='subsysType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pci</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>scsi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='capsType'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='pciBackend'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hostdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <rng supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtio-non-transitional</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>random</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>egd</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </rng>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <filesystem supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='driverType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>path</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>handle</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>virtiofs</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </filesystem>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <tpm supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-tis</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tpm-crb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emulator</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>external</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendVersion'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>2.0</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </tpm>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <redirdev supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='bus'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>usb</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </redirdev>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <channel supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </channel>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <crypto supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendModel'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>builtin</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </crypto>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <interface supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='backendType'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>default</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>passt</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </interface>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <panic supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='model'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>isa</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>hyperv</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </panic>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <console supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='type'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>null</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vc</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pty</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dev</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>file</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>pipe</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stdio</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>udp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tcp</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>unix</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>qemu-vdagent</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>dbus</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </console>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </devices>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   <features>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <gic supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <genid supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <backup supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <async-teardown supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <ps2 supported='yes'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sev supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <sgx supported='no'/>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <hyperv supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='features'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>relaxed</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vapic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>spinlocks</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vpindex</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>runtime</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>synic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>stimer</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reset</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>vendor_id</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>frequencies</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>reenlightenment</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tlbflush</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>ipi</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>avic</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>emsr_bitmap</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>xmm_input</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </defaults>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </hyperv>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     <launchSecurity supported='yes'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       <enum name='sectype'>
Dec 02 23:42:14 compute-0 nova_compute[186303]:         <value>tdx</value>
Dec 02 23:42:14 compute-0 nova_compute[186303]:       </enum>
Dec 02 23:42:14 compute-0 nova_compute[186303]:     </launchSecurity>
Dec 02 23:42:14 compute-0 nova_compute[186303]:   </features>
Dec 02 23:42:14 compute-0 nova_compute[186303]: </domainCapabilities>
Dec 02 23:42:14 compute-0 nova_compute[186303]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.346 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.347 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.347 186307 DEBUG nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.347 186307 INFO nova.virt.libvirt.host [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] Secure Boot support detected
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.357 186307 INFO nova.virt.libvirt.driver [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.357 186307 INFO nova.virt.libvirt.driver [None req-5beb3c9d-6646-4ee9-8565-cbb3f677c72f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.470 186307 DEBUG oslo_concurrency.lockutils [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.471 186307 DEBUG oslo_concurrency.lockutils [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:42:14 compute-0 nova_compute[186303]: 2025-12-02 23:42:14.471 186307 DEBUG oslo_concurrency.lockutils [None req-187b7577-14ad-4ed6-9425-d0f8c4e5bead - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:42:15 compute-0 virtqemud[186944]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 02 23:42:15 compute-0 virtqemud[186944]: hostname: compute-0
Dec 02 23:42:15 compute-0 virtqemud[186944]: End of file while reading data: Input/output error
Dec 02 23:42:15 compute-0 systemd[1]: libpod-e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c.scope: Deactivated successfully.
Dec 02 23:42:15 compute-0 systemd[1]: libpod-e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c.scope: Consumed 3.262s CPU time.
Dec 02 23:42:15 compute-0 podman[187164]: 2025-12-02 23:42:15.043439722 +0000 UTC m=+0.684261481 container died e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 23:42:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c-userdata-shm.mount: Deactivated successfully.
Dec 02 23:42:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d-merged.mount: Deactivated successfully.
Dec 02 23:42:15 compute-0 podman[187164]: 2025-12-02 23:42:15.109845907 +0000 UTC m=+0.750667676 container cleanup e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:42:15 compute-0 podman[187164]: nova_compute
Dec 02 23:42:15 compute-0 podman[187214]: nova_compute
Dec 02 23:42:15 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 02 23:42:15 compute-0 systemd[1]: Stopped nova_compute container.
Dec 02 23:42:15 compute-0 systemd[1]: Starting nova_compute container...
Dec 02 23:42:15 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:42:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463a92acc9cc76096cd68d27063043ecb130df27d77d800ec73daee357f1782d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:15 compute-0 podman[187227]: 2025-12-02 23:42:15.318999238 +0000 UTC m=+0.118363757 container init e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.4, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:42:15 compute-0 podman[187227]: 2025-12-02 23:42:15.32737423 +0000 UTC m=+0.126738709 container start e93a18cec83d9338bd7ad557ff98d9a606c5e06149d82795dc42840708f3374c (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:42:15 compute-0 podman[187227]: nova_compute
Dec 02 23:42:15 compute-0 nova_compute[187243]: + sudo -E kolla_set_configs
Dec 02 23:42:15 compute-0 systemd[1]: Started nova_compute container.
Dec 02 23:42:15 compute-0 sudo[187154]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Validating config file
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying service configuration files
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /etc/ceph
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Creating directory /etc/ceph
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /etc/ceph
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Writing out command to execute
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:15 compute-0 nova_compute[187243]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 02 23:42:15 compute-0 nova_compute[187243]: ++ cat /run_command
Dec 02 23:42:15 compute-0 nova_compute[187243]: + CMD=nova-compute
Dec 02 23:42:15 compute-0 nova_compute[187243]: + ARGS=
Dec 02 23:42:15 compute-0 nova_compute[187243]: + sudo kolla_copy_cacerts
Dec 02 23:42:15 compute-0 nova_compute[187243]: + [[ ! -n '' ]]
Dec 02 23:42:15 compute-0 nova_compute[187243]: + . kolla_extend_start
Dec 02 23:42:15 compute-0 nova_compute[187243]: Running command: 'nova-compute'
Dec 02 23:42:15 compute-0 nova_compute[187243]: + echo 'Running command: '\''nova-compute'\'''
Dec 02 23:42:15 compute-0 nova_compute[187243]: + umask 0022
Dec 02 23:42:15 compute-0 nova_compute[187243]: + exec nova-compute
Dec 02 23:42:16 compute-0 sudo[187404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwacruflvcavjddqcismcqbrdczbkzts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718935.7878473-2955-151503228967042/AnsiballZ_podman_container.py'
Dec 02 23:42:16 compute-0 sudo[187404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:16 compute-0 python3.9[187406]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 02 23:42:16 compute-0 systemd[1]: Started libpod-conmon-9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f.scope.
Dec 02 23:42:16 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:42:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35f377ca4ae043db0a56448c95f1e8cb31dd1aad6763b5e70a3f4a6c03bdf71d/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35f377ca4ae043db0a56448c95f1e8cb31dd1aad6763b5e70a3f4a6c03bdf71d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35f377ca4ae043db0a56448c95f1e8cb31dd1aad6763b5e70a3f4a6c03bdf71d/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 02 23:42:16 compute-0 podman[187430]: 2025-12-02 23:42:16.628177005 +0000 UTC m=+0.139705380 container init 9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=edpm, container_name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:42:16 compute-0 podman[187430]: 2025-12-02 23:42:16.639564623 +0000 UTC m=+0.151092988 container start 9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:42:16 compute-0 python3.9[187406]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Applying nova statedir ownership
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 02 23:42:16 compute-0 nova_compute_init[187452]: INFO:nova_statedir:Nova statedir ownership complete
Dec 02 23:42:16 compute-0 systemd[1]: libpod-9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f.scope: Deactivated successfully.
Dec 02 23:42:16 compute-0 podman[187453]: 2025-12-02 23:42:16.722848786 +0000 UTC m=+0.045313417 container died 9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec 02 23:42:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f-userdata-shm.mount: Deactivated successfully.
Dec 02 23:42:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-35f377ca4ae043db0a56448c95f1e8cb31dd1aad6763b5e70a3f4a6c03bdf71d-merged.mount: Deactivated successfully.
Dec 02 23:42:16 compute-0 podman[187466]: 2025-12-02 23:42:16.793245387 +0000 UTC m=+0.060256262 container cleanup 9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f (image=38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'image': '38.102.83.2:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible)
Dec 02 23:42:16 compute-0 systemd[1]: libpod-conmon-9edc84ea7cbcfdaac32e33a0ffa524e16e33e7ad041c439363a8adee37cadb2f.scope: Deactivated successfully.
Dec 02 23:42:16 compute-0 sudo[187404]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:17 compute-0 sshd-session[159060]: Connection closed by 192.168.122.30 port 41530
Dec 02 23:42:17 compute-0 sshd-session[159057]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:42:17 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Dec 02 23:42:17 compute-0 systemd[1]: session-24.scope: Consumed 2min 6.329s CPU time.
Dec 02 23:42:17 compute-0 systemd-logind[795]: Session 24 logged out. Waiting for processes to exit.
Dec 02 23:42:17 compute-0 systemd-logind[795]: Removed session 24.
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.413 187247 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.413 187247 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.413 187247 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.413 187247 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.534 187247 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.561 187247 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.562 187247 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.591 187247 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Dec 02 23:42:17 compute-0 nova_compute[187243]: 2025-12-02 23:42:17.592 187247 WARNING oslo_config.cfg [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Dec 02 23:42:18 compute-0 nova_compute[187243]: 2025-12-02 23:42:18.622 187247 INFO nova.virt.driver [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 02 23:42:18 compute-0 nova_compute[187243]: 2025-12-02 23:42:18.735 187247 INFO nova.compute.provider_config [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.242 187247 DEBUG oslo_concurrency.lockutils [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.243 187247 DEBUG oslo_concurrency.lockutils [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.244 187247 DEBUG oslo_concurrency.lockutils [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.245 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.245 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.245 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.246 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.246 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.246 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.247 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.247 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.247 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.248 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.248 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.248 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.249 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.249 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.249 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.250 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.250 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.250 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.250 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.251 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.251 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.251 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.251 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.252 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.252 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.252 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.253 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.253 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.253 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.254 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.254 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.254 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.254 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.255 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.255 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.255 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.256 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.256 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.256 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.256 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.257 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.257 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.257 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.258 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.258 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.258 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.258 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.259 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.259 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.259 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.259 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.260 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.260 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.260 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.261 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.261 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.261 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.262 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.262 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.262 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.263 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.263 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.263 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.264 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.264 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.264 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.264 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.265 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.265 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.265 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.265 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.266 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.266 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.266 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.267 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.267 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.267 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.267 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.268 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.268 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.268 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.268 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.269 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.269 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.269 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.270 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.270 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.270 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.270 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.271 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.271 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.271 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.271 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.272 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.272 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.272 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.272 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.273 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.273 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.273 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.274 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.274 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.274 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.274 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.275 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.275 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.275 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.276 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.276 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.276 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.277 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.277 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.277 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.277 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.278 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.278 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.278 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.278 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.279 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.279 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.279 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.280 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.280 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.280 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.280 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.281 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.281 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.281 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.281 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.282 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.282 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.282 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.282 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.283 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.283 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.283 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.284 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.284 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.284 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.284 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.285 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.285 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.285 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.285 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.286 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.286 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.286 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.287 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.287 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.287 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.287 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.288 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.288 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.288 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.289 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.289 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.289 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.290 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.290 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.290 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.290 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.291 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.291 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.291 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.291 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.292 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.292 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.292 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.292 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.293 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.293 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.293 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.293 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.293 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.294 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.294 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.294 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.294 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.294 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.294 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.295 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.295 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.295 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.295 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.295 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.296 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.296 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.296 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.296 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.296 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.296 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.297 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.297 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.297 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.297 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.297 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.297 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.298 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.298 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.298 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.298 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.298 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.299 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.299 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.299 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.299 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.299 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.299 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.300 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.300 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.300 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.300 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.300 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.301 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.301 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.301 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.301 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.301 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.301 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.302 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.302 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.302 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.302 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.302 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.302 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.303 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.303 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.303 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.303 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.303 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.304 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.304 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.304 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.304 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.304 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.305 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.305 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.305 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.305 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.305 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.305 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.306 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.306 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.306 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.306 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.306 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.306 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.307 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.307 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.308 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.309 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.309 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.309 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.309 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.310 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.310 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.310 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.310 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.310 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.310 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.311 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.311 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.311 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.311 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.311 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.311 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.312 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.312 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.312 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.312 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.312 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.313 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.313 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.313 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.313 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.313 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.313 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.314 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.314 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.314 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.314 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.314 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.315 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.315 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.315 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.315 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.315 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.316 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.316 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.316 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.316 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.316 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.317 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.317 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.317 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.317 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.317 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.318 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.318 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.318 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.318 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.318 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.318 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.319 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.319 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.319 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.319 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.319 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.319 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.320 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.320 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.320 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.320 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.320 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.320 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.321 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.321 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.321 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.321 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.321 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.321 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.322 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.322 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.322 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.322 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.322 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.323 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.323 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.323 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.323 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.323 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.323 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.324 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.324 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.324 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.324 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.324 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.324 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.325 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.325 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.325 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.325 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.325 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.325 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.326 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.326 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.326 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.326 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.326 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.327 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.327 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.327 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.327 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.327 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.327 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.328 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.328 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.328 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.328 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.328 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.328 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.329 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.330 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.331 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.332 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.333 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.334 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.335 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.336 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.337 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.338 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.339 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.340 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.341 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.342 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.343 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.344 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.345 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.346 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.347 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.348 187247 WARNING oslo_config.cfg [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 02 23:42:19 compute-0 nova_compute[187243]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 02 23:42:19 compute-0 nova_compute[187243]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 02 23:42:19 compute-0 nova_compute[187243]: and ``live_migration_inbound_addr`` respectively.
Dec 02 23:42:19 compute-0 nova_compute[187243]: ).  Its value may be silently ignored in the future.
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.348 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.348 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.348 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.348 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.349 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.350 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.351 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.352 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.353 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.354 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.355 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.356 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.356 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.356 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.356 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.356 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.356 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.357 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.357 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.357 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.357 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.357 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.357 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.358 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.358 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.358 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.358 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.358 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.358 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.359 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.359 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.359 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.359 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.359 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.360 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.360 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.360 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.360 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.360 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.360 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.361 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.362 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.362 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.362 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.362 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.362 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.362 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.363 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.364 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.365 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.366 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.367 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.368 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.369 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.370 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.371 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.372 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.372 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.372 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.372 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.372 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.372 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.373 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.374 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.374 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.374 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.374 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.374 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.374 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.375 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.376 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.377 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.378 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.379 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.380 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.381 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.382 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.383 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.384 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.385 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.386 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.387 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.388 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.389 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.390 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.391 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.392 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.393 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.394 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.395 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.396 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.397 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.398 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.399 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.400 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.401 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.402 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.403 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.404 187247 DEBUG oslo_service.backend._eventlet.service [None req-970b4cd2-e834-4004-b2ab-888d3127bea7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.405 187247 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.913 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.930 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1d68d6ad50> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Dec 02 23:42:19 compute-0 nova_compute[187243]: libvirt:  error : internal error: could not initialize domain event timer
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.931 187247 WARNING nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.932 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1d68d6ad50> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.934 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.935 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.936 187247 INFO nova.utils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] The default thread pool MainProcess.default is initialized
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.936 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.937 187247 INFO nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Connection event '1' reason 'None'
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.945 187247 INFO nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Libvirt host capabilities <capabilities>
Dec 02 23:42:19 compute-0 nova_compute[187243]: 
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <host>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <uuid>b18b06c3-ce10-4c45-b42d-5391c38cc9dc</uuid>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <cpu>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <arch>x86_64</arch>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model>EPYC-Rome-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <vendor>AMD</vendor>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <microcode version='16777317'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <signature family='23' model='49' stepping='0'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='x2apic'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='tsc-deadline'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='osxsave'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='hypervisor'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='tsc_adjust'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='spec-ctrl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='stibp'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='arch-capabilities'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='ssbd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='cmp_legacy'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='topoext'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='virt-ssbd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='lbrv'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='tsc-scale'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='vmcb-clean'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='pause-filter'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='pfthreshold'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='svme-addr-chk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='rdctl-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='skip-l1dfl-vmentry'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='mds-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature name='pschange-mc-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <pages unit='KiB' size='4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <pages unit='KiB' size='2048'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <pages unit='KiB' size='1048576'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </cpu>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <power_management>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <suspend_mem/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <suspend_disk/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <suspend_hybrid/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </power_management>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <iommu support='no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <migration_features>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <live/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <uri_transports>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <uri_transport>tcp</uri_transport>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <uri_transport>rdma</uri_transport>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </uri_transports>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </migration_features>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <topology>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <cells num='1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <cell id='0'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           <memory unit='KiB'>7864316</memory>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           <pages unit='KiB' size='2048'>0</pages>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           <distances>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <sibling id='0' value='10'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           </distances>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           <cpus num='8'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:           </cpus>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         </cell>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </cells>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </topology>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <cache>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </cache>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <secmodel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model>selinux</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <doi>0</doi>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </secmodel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <secmodel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model>dac</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <doi>0</doi>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </secmodel>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   </host>
Dec 02 23:42:19 compute-0 nova_compute[187243]: 
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <guest>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <os_type>hvm</os_type>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <arch name='i686'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <wordsize>32</wordsize>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <domain type='qemu'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <domain type='kvm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </arch>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <features>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <pae/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <nonpae/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <apic default='on' toggle='no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <cpuselection/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <deviceboot/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <externalSnapshot/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </features>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   </guest>
Dec 02 23:42:19 compute-0 nova_compute[187243]: 
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <guest>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <os_type>hvm</os_type>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <arch name='x86_64'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <wordsize>64</wordsize>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <domain type='qemu'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <domain type='kvm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </arch>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <features>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <acpi default='on' toggle='yes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <apic default='on' toggle='no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <cpuselection/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <deviceboot/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <disksnapshot default='on' toggle='no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <externalSnapshot/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </features>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   </guest>
Dec 02 23:42:19 compute-0 nova_compute[187243]: 
Dec 02 23:42:19 compute-0 nova_compute[187243]: </capabilities>
Dec 02 23:42:19 compute-0 nova_compute[187243]: 
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.954 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:19 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.962 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 02 23:42:19 compute-0 nova_compute[187243]: <domainCapabilities>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <domain>kvm</domain>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <arch>i686</arch>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <vcpu max='4096'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <iothreads supported='yes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <os supported='yes'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <enum name='firmware'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <loader supported='yes'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>rom</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>pflash</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <enum name='readonly'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>yes</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <enum name='secure'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </loader>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   </os>
Dec 02 23:42:19 compute-0 nova_compute[187243]:   <cpu>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <enum name='maximumMigratable'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <vendor>AMD</vendor>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:19 compute-0 nova_compute[187243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cooperlake'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Denverton'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx10'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx10-128'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx10-256'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx10-512'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='IvyBridge'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='KnightsMill'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:19 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:19 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>anonymous</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>memfd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </memoryBacking>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <disk supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>disk</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cdrom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>floppy</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>lun</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>fdc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>sata</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vnc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <video supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='modelType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vga</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cirrus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>none</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>bochs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ramfb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </video>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='mode'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>subsystem</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>mandatory</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>requisite</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>optional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pci</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hostdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <rng supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>random</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='driverType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>path</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>handle</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </filesystem>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emulator</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>external</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>2.0</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </tpm>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </redirdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <channel supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </channel>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </crypto>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <interface supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>passt</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <panic supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>isa</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>hyperv</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </panic>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <console supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>null</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dev</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pipe</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stdio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>udp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tcp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </console>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <features>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <gic supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sev supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='features'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>relaxed</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vapic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vpindex</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>runtime</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>synic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stimer</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reset</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>frequencies</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ipi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>avic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hyperv>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='sectype'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tdx</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </launchSecurity>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </features>
Dec 02 23:42:20 compute-0 nova_compute[187243]: </domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:19.967 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 02 23:42:20 compute-0 nova_compute[187243]: <domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <arch>i686</arch>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <vcpu max='240'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <os supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='firmware'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <loader supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>rom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pflash</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='readonly'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>yes</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='secure'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </loader>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </os>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>anonymous</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>memfd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </memoryBacking>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <disk supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>disk</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cdrom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>floppy</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>lun</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ide</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>fdc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>sata</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vnc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <video supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='modelType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vga</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cirrus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>none</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>bochs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ramfb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </video>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='mode'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>subsystem</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>mandatory</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>requisite</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>optional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pci</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hostdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <rng supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>random</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='driverType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>path</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>handle</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </filesystem>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emulator</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>external</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>2.0</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </tpm>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </redirdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <channel supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </channel>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </crypto>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <interface supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>passt</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <panic supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>isa</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>hyperv</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </panic>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <console supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>null</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dev</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pipe</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stdio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>udp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tcp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </console>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <features>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <gic supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sev supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='features'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>relaxed</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vapic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vpindex</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>runtime</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>synic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stimer</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reset</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>frequencies</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ipi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>avic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hyperv>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='sectype'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tdx</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </launchSecurity>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </features>
Dec 02 23:42:20 compute-0 nova_compute[187243]: </domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.018 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.023 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 02 23:42:20 compute-0 nova_compute[187243]: <domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <arch>x86_64</arch>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <vcpu max='4096'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <os supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='firmware'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>efi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <loader supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>rom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pflash</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='readonly'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>yes</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='secure'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>yes</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </loader>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </os>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>anonymous</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>memfd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </memoryBacking>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <disk supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>disk</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cdrom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>floppy</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>lun</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>fdc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>sata</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vnc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <video supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='modelType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vga</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cirrus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>none</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>bochs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ramfb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </video>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='mode'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>subsystem</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>mandatory</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>requisite</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>optional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pci</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hostdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <rng supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>random</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='driverType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>path</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>handle</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </filesystem>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emulator</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>external</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>2.0</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </tpm>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </redirdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <channel supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </channel>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </crypto>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <interface supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>passt</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <panic supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>isa</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>hyperv</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </panic>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <console supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>null</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dev</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pipe</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stdio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>udp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tcp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </console>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <features>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <gic supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sev supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='features'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>relaxed</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vapic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vpindex</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>runtime</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>synic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stimer</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reset</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>frequencies</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ipi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>avic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hyperv>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='sectype'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tdx</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </launchSecurity>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </features>
Dec 02 23:42:20 compute-0 nova_compute[187243]: </domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.077 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 02 23:42:20 compute-0 nova_compute[187243]: <domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <path>/usr/libexec/qemu-kvm</path>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <domain>kvm</domain>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <arch>x86_64</arch>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <vcpu max='240'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <iothreads supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <os supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='firmware'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <loader supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>rom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pflash</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='readonly'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>yes</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='secure'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>no</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </loader>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </os>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='host-passthrough' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='hostPassthroughMigratable'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='maximum' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='maximumMigratable'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>on</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>off</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='host-model' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <vendor>AMD</vendor>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='x2apic'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-deadline'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='hypervisor'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc_adjust'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='spec-ctrl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='stibp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='cmp_legacy'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='overflow-recov'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='succor'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='amd-ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='virt-ssbd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='lbrv'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='tsc-scale'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='vmcb-clean'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='flushbyasid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='pause-filter'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='pfthreshold'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='svme-addr-chk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <feature policy='disable' name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <mode name='custom' supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Broadwell-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cascadelake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Cooperlake-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Denverton-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Dhyana-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Genoa-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='auto-ibrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Milan-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amd-psfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='no-nested-data-bp'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='null-sel-clr-base'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='stibp-always-on'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-Rome-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='EPYC-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='GraniteRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-128'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-256'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx10-512'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='prefetchiti'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Haswell-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-noTSX'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v6'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Icelake-Server-v7'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='IvyBridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='KnightsMill'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='KnightsMill-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4fmaps'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-4vnniw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512er'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512pf'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G4-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Opteron_G5-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fma4'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tbm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xop'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SapphireRapids-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='amx-tile'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-bf16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-fp16'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512-vpopcntdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bitalg'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vbmi2'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrc'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fzrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='la57'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='taa-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='tsx-ldtrk'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xfd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='SierraForest-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ifma'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-ne-convert'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx-vnni-int8'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='bus-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cmpccxadd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fbsdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='fsrs'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ibrs-all'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mcdt-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pbrsb-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='psdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='sbdr-ssdp-no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='serialize'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vaes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='vpclmulqdq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Client-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='hle'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='rtm'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Skylake-Server-v5'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512bw'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512cd'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512dq'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512f'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='avx512vl'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='invpcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pcid'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='pku'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='mpx'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v2'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v3'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='core-capability'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='split-lock-detect'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='Snowridge-v4'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='cldemote'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='erms'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='gfni'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdir64b'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='movdiri'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='xsaves'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='athlon-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='core2duo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='coreduo-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='n270-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='ss'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <blockers model='phenom-v1'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnow'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <feature name='3dnowext'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </blockers>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </mode>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <memoryBacking supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <enum name='sourceType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>anonymous</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <value>memfd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </memoryBacking>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <disk supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='diskDevice'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>disk</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cdrom</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>floppy</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>lun</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ide</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>fdc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>sata</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <graphics supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vnc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egl-headless</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <video supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='modelType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vga</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>cirrus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>none</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>bochs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ramfb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </video>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hostdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='mode'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>subsystem</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='startupPolicy'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>mandatory</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>requisite</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>optional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='subsysType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pci</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>scsi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='capsType'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='pciBackend'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hostdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <rng supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtio-non-transitional</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>random</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>egd</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <filesystem supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='driverType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>path</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>handle</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>virtiofs</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </filesystem>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <tpm supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-tis</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tpm-crb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emulator</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>external</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendVersion'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>2.0</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </tpm>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <redirdev supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='bus'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>usb</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </redirdev>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <channel supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </channel>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <crypto supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendModel'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>builtin</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </crypto>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <interface supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='backendType'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>default</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>passt</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <panic supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='model'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>isa</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>hyperv</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </panic>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <console supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='type'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>null</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vc</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pty</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dev</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>file</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>pipe</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stdio</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>udp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tcp</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>unix</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>qemu-vdagent</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>dbus</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </console>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <features>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <gic supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <vmcoreinfo supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <genid supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backingStoreInput supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <backup supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <async-teardown supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <ps2 supported='yes'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sev supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <sgx supported='no'/>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <hyperv supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='features'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>relaxed</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vapic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>spinlocks</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vpindex</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>runtime</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>synic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>stimer</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reset</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>vendor_id</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>frequencies</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>reenlightenment</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tlbflush</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>ipi</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>avic</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>emsr_bitmap</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>xmm_input</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <spinlocks>4095</spinlocks>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <stimer_direct>on</stimer_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_direct>on</tlbflush_direct>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <tlbflush_extended>on</tlbflush_extended>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </defaults>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </hyperv>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     <launchSecurity supported='yes'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       <enum name='sectype'>
Dec 02 23:42:20 compute-0 nova_compute[187243]:         <value>tdx</value>
Dec 02 23:42:20 compute-0 nova_compute[187243]:       </enum>
Dec 02 23:42:20 compute-0 nova_compute[187243]:     </launchSecurity>
Dec 02 23:42:20 compute-0 nova_compute[187243]:   </features>
Dec 02 23:42:20 compute-0 nova_compute[187243]: </domainCapabilities>
Dec 02 23:42:20 compute-0 nova_compute[187243]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.137 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.137 187247 INFO nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Secure Boot support detected
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.142 187247 INFO nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.157 187247 DEBUG nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] cpu compare xml: <cpu match="exact">
Dec 02 23:42:20 compute-0 nova_compute[187243]:   <model>Nehalem</model>
Dec 02 23:42:20 compute-0 nova_compute[187243]: </cpu>
Dec 02 23:42:20 compute-0 nova_compute[187243]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.159 187247 DEBUG nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.444 187247 WARNING nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.445 187247 DEBUG nova.virt.libvirt.volume.mount [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 02 23:42:20 compute-0 nova_compute[187243]: 2025-12-02 23:42:20.670 187247 INFO nova.virt.node [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Determined node identity 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from /var/lib/nova/compute_id
Dec 02 23:42:21 compute-0 podman[187538]: 2025-12-02 23:42:21.138887936 +0000 UTC m=+0.087082205 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Dec 02 23:42:21 compute-0 nova_compute[187243]: 2025-12-02 23:42:21.182 187247 WARNING nova.compute.manager [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Compute nodes ['0d6e1fe8-f800-4b94-a0c0-ea75083d5248'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 02 23:42:22 compute-0 nova_compute[187243]: 2025-12-02 23:42:22.195 187247 INFO nova.compute.manager [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.330 187247 WARNING nova.compute.manager [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.331 187247 DEBUG oslo_concurrency.lockutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.331 187247 DEBUG oslo_concurrency.lockutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.332 187247 DEBUG oslo_concurrency.lockutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.332 187247 DEBUG nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:42:23 compute-0 sshd-session[187558]: Accepted publickey for zuul from 192.168.122.30 port 33720 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 02 23:42:23 compute-0 systemd-logind[795]: New session 26 of user zuul.
Dec 02 23:42:23 compute-0 systemd[1]: Started Session 26 of User zuul.
Dec 02 23:42:23 compute-0 sshd-session[187558]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.542 187247 WARNING nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.543 187247 DEBUG oslo_concurrency.processutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.584 187247 DEBUG oslo_concurrency.processutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.584 187247 DEBUG nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6195MB free_disk=73.36831283569336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.585 187247 DEBUG oslo_concurrency.lockutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:42:23 compute-0 nova_compute[187243]: 2025-12-02 23:42:23.586 187247 DEBUG oslo_concurrency.lockutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:42:24 compute-0 nova_compute[187243]: 2025-12-02 23:42:24.094 187247 WARNING nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] No compute node record for compute-0.ctlplane.example.com:0d6e1fe8-f800-4b94-a0c0-ea75083d5248: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 could not be found.
Dec 02 23:42:24 compute-0 python3.9[187712]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 02 23:42:24 compute-0 nova_compute[187243]: 2025-12-02 23:42:24.602 187247 INFO nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248
Dec 02 23:42:26 compute-0 sudo[187866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aunlryijyubflfsyaejprfnsvffwadwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718945.3927824-52-21238419954320/AnsiballZ_systemd_service.py'
Dec 02 23:42:26 compute-0 sudo[187866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:26 compute-0 nova_compute[187243]: 2025-12-02 23:42:26.129 187247 DEBUG nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:42:26 compute-0 nova_compute[187243]: 2025-12-02 23:42:26.129 187247 DEBUG nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:42:23 up 50 min,  0 user,  load average: 0.82, 0.81, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:42:26 compute-0 python3.9[187868]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:42:26 compute-0 systemd[1]: Reloading.
Dec 02 23:42:26 compute-0 systemd-rc-local-generator[187896]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:26 compute-0 systemd-sysv-generator[187900]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:26 compute-0 sudo[187866]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.076 187247 INFO nova.scheduler.client.report [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] [req-3e79cf5f-3f6d-488a-a603-2257b82d3393] Created resource provider record via placement API for resource provider with UUID 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 and name compute-0.ctlplane.example.com.
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.096 187247 DEBUG nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 02 23:42:27 compute-0 nova_compute[187243]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.096 187247 INFO nova.virt.libvirt.host [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] kernel doesn't support AMD SEV
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.096 187247 DEBUG nova.compute.provider_tree [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.097 187247 DEBUG nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.099 187247 DEBUG nova.virt.libvirt.driver [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Libvirt baseline CPU <cpu>
Dec 02 23:42:27 compute-0 nova_compute[187243]:   <arch>x86_64</arch>
Dec 02 23:42:27 compute-0 nova_compute[187243]:   <model>Nehalem</model>
Dec 02 23:42:27 compute-0 nova_compute[187243]:   <vendor>AMD</vendor>
Dec 02 23:42:27 compute-0 nova_compute[187243]:   <topology sockets="8" cores="1" threads="1"/>
Dec 02 23:42:27 compute-0 nova_compute[187243]:   <maxphysaddr mode="emulate" bits="40"/>
Dec 02 23:42:27 compute-0 nova_compute[187243]: </cpu>
Dec 02 23:42:27 compute-0 nova_compute[187243]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Dec 02 23:42:27 compute-0 python3.9[188053]: ansible-ansible.builtin.service_facts Invoked
Dec 02 23:42:27 compute-0 network[188070]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 02 23:42:27 compute-0 network[188071]: 'network-scripts' will be removed from distribution in near future.
Dec 02 23:42:27 compute-0 network[188072]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.670 187247 DEBUG nova.scheduler.client.report [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Updated inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.670 187247 DEBUG nova.compute.provider_tree [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.670 187247 DEBUG nova.compute.provider_tree [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:42:27 compute-0 nova_compute[187243]: 2025-12-02 23:42:27.834 187247 DEBUG nova.compute.provider_tree [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 02 23:42:28 compute-0 nova_compute[187243]: 2025-12-02 23:42:28.343 187247 DEBUG nova.compute.resource_tracker [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:42:28 compute-0 nova_compute[187243]: 2025-12-02 23:42:28.344 187247 DEBUG oslo_concurrency.lockutils [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.757s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:42:28 compute-0 nova_compute[187243]: 2025-12-02 23:42:28.344 187247 DEBUG nova.service [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Dec 02 23:42:28 compute-0 nova_compute[187243]: 2025-12-02 23:42:28.477 187247 DEBUG nova.service [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Dec 02 23:42:28 compute-0 nova_compute[187243]: 2025-12-02 23:42:28.478 187247 DEBUG nova.servicegroup.drivers.db [None req-fba229da-eb22-4cfc-be9e-b04137647290 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Dec 02 23:42:32 compute-0 sudo[188344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyqzzsjtyhxgaocrrzifrpsmkaqbcwjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718952.51651-90-111202448448486/AnsiballZ_systemd_service.py'
Dec 02 23:42:32 compute-0 sudo[188344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:33 compute-0 python3.9[188346]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:42:33 compute-0 sudo[188344]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:34 compute-0 sudo[188497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufldepvmeppyocuoxtibaallbzyrbvgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718953.6783662-110-230345907745037/AnsiballZ_file.py'
Dec 02 23:42:34 compute-0 sudo[188497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:34 compute-0 python3.9[188499]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:34 compute-0 sudo[188497]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:34 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 02 23:42:35 compute-0 sudo[188650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shsgvpmijxprbxonhbzvfdocswkmrmzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718954.6526747-126-60137421811929/AnsiballZ_file.py'
Dec 02 23:42:35 compute-0 sudo[188650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:35 compute-0 python3.9[188652]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:35 compute-0 sudo[188650]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:36 compute-0 podman[188749]: 2025-12-02 23:42:36.141983831 +0000 UTC m=+0.077620501 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:42:36 compute-0 podman[188753]: 2025-12-02 23:42:36.198519511 +0000 UTC m=+0.136014081 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 02 23:42:36 compute-0 sudo[188846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfrzoeviknfoffhupuaqibvobnnsaqbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718955.7258835-144-235287472921409/AnsiballZ_command.py'
Dec 02 23:42:36 compute-0 sudo[188846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:36 compute-0 python3.9[188848]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:42:36 compute-0 sudo[188846]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:37 compute-0 python3.9[189000]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:42:38 compute-0 sudo[189150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paycyotxyxukxbrtiitciotivjxjhdyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718957.8338475-180-183793693323490/AnsiballZ_systemd_service.py'
Dec 02 23:42:38 compute-0 sudo[189150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:38 compute-0 python3.9[189152]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:42:38 compute-0 systemd[1]: Reloading.
Dec 02 23:42:38 compute-0 systemd-rc-local-generator[189180]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:42:38 compute-0 systemd-sysv-generator[189184]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:42:38 compute-0 sudo[189150]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:39 compute-0 sudo[189338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nznavxnfqmvggyxlbblysrbgijecegtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718959.177254-196-244010243503951/AnsiballZ_command.py'
Dec 02 23:42:39 compute-0 sudo[189338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:39 compute-0 python3.9[189340]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:42:39 compute-0 sudo[189338]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:40 compute-0 sudo[189491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fysugzjubunhsmyxjpztnbjxdqgdsbha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718960.2327921-214-120169270677589/AnsiballZ_file.py'
Dec 02 23:42:40 compute-0 sudo[189491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:40 compute-0 python3.9[189493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:42:40 compute-0 sudo[189491]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:41 compute-0 python3.9[189643]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:42 compute-0 python3.9[189795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:43 compute-0 python3.9[189916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718962.1170142-246-4775175711935/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:42:44 compute-0 sudo[190066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svzcomrneaiulizgbddxedkoewqyzvxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718963.759347-276-45444789567913/AnsiballZ_group.py'
Dec 02 23:42:44 compute-0 sudo[190066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:44 compute-0 python3.9[190068]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 02 23:42:44 compute-0 sudo[190066]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:45 compute-0 sudo[190218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfolrrqmhogxcimwtzxfmvljlhibwyri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718964.9542155-298-236665623290294/AnsiballZ_getent.py'
Dec 02 23:42:45 compute-0 sudo[190218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:45 compute-0 python3.9[190220]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 02 23:42:45 compute-0 sudo[190218]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:46 compute-0 sudo[190371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaljigbfvzjqlffqtwgwyhjnoeolfggr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718966.0210698-314-178305156047045/AnsiballZ_group.py'
Dec 02 23:42:46 compute-0 sudo[190371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:46 compute-0 python3.9[190373]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 02 23:42:46 compute-0 groupadd[190374]: group added to /etc/group: name=ceilometer, GID=42405
Dec 02 23:42:46 compute-0 groupadd[190374]: group added to /etc/gshadow: name=ceilometer
Dec 02 23:42:46 compute-0 groupadd[190374]: new group: name=ceilometer, GID=42405
Dec 02 23:42:46 compute-0 sudo[190371]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:47 compute-0 sudo[190529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtcepabyutqpgwuhwpwnzgwpijvqntzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718967.0704312-330-227809552520192/AnsiballZ_user.py'
Dec 02 23:42:47 compute-0 sudo[190529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:42:47 compute-0 python3.9[190531]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 02 23:42:47 compute-0 useradd[190533]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 02 23:42:47 compute-0 useradd[190533]: add 'ceilometer' to group 'libvirt'
Dec 02 23:42:47 compute-0 useradd[190533]: add 'ceilometer' to shadow group 'libvirt'
Dec 02 23:42:48 compute-0 sudo[190529]: pam_unix(sudo:session): session closed for user root
Dec 02 23:42:49 compute-0 python3.9[190689]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:50 compute-0 python3.9[190810]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764718969.266747-382-151548634961410/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:51 compute-0 python3.9[190960]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:51 compute-0 podman[191055]: 2025-12-02 23:42:51.488632927 +0000 UTC m=+0.093455415 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 23:42:51 compute-0 python3.9[191098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764718970.5561814-382-164609563630696/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:52 compute-0 python3.9[191251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:52 compute-0 python3.9[191372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764718971.8536358-382-85208546731708/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:53 compute-0 python3.9[191522]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:54 compute-0 python3.9[191674]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:42:55 compute-0 python3.9[191826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:56 compute-0 python3.9[191947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718974.9939456-500-270019112179988/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:56 compute-0 python3.9[192097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:57 compute-0 python3.9[192173]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:58 compute-0 python3.9[192323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:58 compute-0 python3.9[192444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718977.5948503-500-274945107422845/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=3a381808a650224f9d664cc68513cbbb45330072 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:42:59 compute-0 python3.9[192594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:42:59 compute-0 python3.9[192715]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718978.8579216-500-130780460532144/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:00 compute-0 python3.9[192865]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:43:00.658 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:43:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:43:00.658 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:43:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:43:00.659 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:43:00 compute-0 sshd-session[192909]: error: kex_exchange_identification: read: Connection reset by peer
Dec 02 23:43:00 compute-0 sshd-session[192909]: Connection reset by 45.140.17.97 port 32462
Dec 02 23:43:01 compute-0 python3.9[192989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718980.1102393-500-92293460475153/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:01 compute-0 python3.9[193139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:02 compute-0 python3.9[193260]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718981.434055-500-217351280252382/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:03 compute-0 python3.9[193410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:03 compute-0 python3.9[193533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718982.6571274-500-153717288327687/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:04 compute-0 python3.9[193683]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:05 compute-0 python3.9[193804]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718983.87482-500-156793875256961/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:05 compute-0 python3.9[193954]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:06 compute-0 podman[194049]: 2025-12-02 23:43:06.353374279 +0000 UTC m=+0.075709788 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 02 23:43:06 compute-0 podman[194050]: 2025-12-02 23:43:06.426542342 +0000 UTC m=+0.136372939 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 02 23:43:06 compute-0 python3.9[194107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718985.2490232-500-192112341325902/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:07 compute-0 python3.9[194270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:07 compute-0 python3.9[194391]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718986.7495847-500-142772679975659/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:08 compute-0 python3.9[194541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:09 compute-0 python3.9[194662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764718988.0517297-500-194223459755166/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:10 compute-0 sshd-session[193434]: Received disconnect from 45.78.219.213 port 57450:11: Bye Bye [preauth]
Dec 02 23:43:10 compute-0 sshd-session[193434]: Disconnected from 45.78.219.213 port 57450 [preauth]
Dec 02 23:43:11 compute-0 python3.9[194812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:12 compute-0 python3.9[194888]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:13 compute-0 python3.9[195038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:13 compute-0 python3.9[195114]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:14 compute-0 python3.9[195264]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:15 compute-0 python3.9[195340]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:15 compute-0 sudo[195490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwlbhikvggzufkrihfdckpkkgnlyqqao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718995.3710673-878-211160003212753/AnsiballZ_file.py'
Dec 02 23:43:15 compute-0 sudo[195490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:15 compute-0 python3.9[195492]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:15 compute-0 sudo[195490]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:16 compute-0 sudo[195642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxqhnwwfkdoylaxyqmwytkxmrpldhrwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718996.225305-894-8267078964460/AnsiballZ_file.py'
Dec 02 23:43:16 compute-0 sudo[195642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:16 compute-0 python3.9[195644]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:16 compute-0 sudo[195642]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:17 compute-0 sudo[195794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anqbzqlbimrmtgujetaotnkxncjetlih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718997.116162-910-212605154805500/AnsiballZ_file.py'
Dec 02 23:43:17 compute-0 sudo[195794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:17 compute-0 python3.9[195796]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:43:17 compute-0 sudo[195794]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:18 compute-0 sudo[195946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpxfemjzejlkmgqrpgiqbjpsbkjdlqtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718997.9924731-926-100701157434112/AnsiballZ_systemd_service.py'
Dec 02 23:43:18 compute-0 sudo[195946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:18 compute-0 python3.9[195948]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:43:18 compute-0 systemd[1]: Reloading.
Dec 02 23:43:18 compute-0 systemd-rc-local-generator[195972]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:18 compute-0 systemd-sysv-generator[195978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:19 compute-0 systemd[1]: Listening on Podman API Socket.
Dec 02 23:43:19 compute-0 sudo[195946]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:19 compute-0 sudo[196136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mejqtonwjhrtfdornjmmonadiycjpivx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718999.6165905-944-18889081702092/AnsiballZ_stat.py'
Dec 02 23:43:19 compute-0 sudo[196136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:20 compute-0 python3.9[196138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:20 compute-0 sudo[196136]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:20 compute-0 sudo[196259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujfbdcezrbudialjvxlenkqvcztvmsnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764718999.6165905-944-18889081702092/AnsiballZ_copy.py'
Dec 02 23:43:20 compute-0 sudo[196259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:20 compute-0 python3.9[196261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764718999.6165905-944-18889081702092/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:43:20 compute-0 sudo[196259]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.480 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.482 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.483 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.483 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.484 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.484 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.485 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 sudo[196423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xapfkpeckvnxjoztxxcqcqsdhnyngola ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719001.1369565-978-264000829885842/AnsiballZ_container_config_data.py'
Dec 02 23:43:21 compute-0 sudo[196423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:21 compute-0 podman[196385]: 2025-12-02 23:43:21.69898016 +0000 UTC m=+0.081978129 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Dec 02 23:43:21 compute-0 python3.9[196427]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 02 23:43:21 compute-0 sudo[196423]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.996 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.997 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:43:21 compute-0 nova_compute[187243]: 2025-12-02 23:43:21.997 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.509 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.509 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.509 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.509 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.668 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.670 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.689 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.690 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6144MB free_disk=73.3683090209961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.690 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:43:22 compute-0 nova_compute[187243]: 2025-12-02 23:43:22.690 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:43:22 compute-0 sudo[196583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqmksdliqluhcskccwirfiofueoqslfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719002.2219903-996-217911467604730/AnsiballZ_container_config_hash.py'
Dec 02 23:43:22 compute-0 sudo[196583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:22 compute-0 python3.9[196585]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:43:22 compute-0 sudo[196583]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:23 compute-0 nova_compute[187243]: 2025-12-02 23:43:23.849 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:43:23 compute-0 nova_compute[187243]: 2025-12-02 23:43:23.849 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:43:22 up 51 min,  0 user,  load average: 0.83, 0.81, 0.64\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:43:23 compute-0 nova_compute[187243]: 2025-12-02 23:43:23.920 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:43:23 compute-0 sudo[196735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arwbzihaifpejnwkmaduykqjvidnvjqm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764719003.3962708-1016-233520650701017/AnsiballZ_edpm_container_manage.py'
Dec 02 23:43:23 compute-0 sudo[196735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:24 compute-0 python3[196738]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:43:24 compute-0 nova_compute[187243]: 2025-12-02 23:43:24.426 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:43:24 compute-0 nova_compute[187243]: 2025-12-02 23:43:24.936 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:43:24 compute-0 nova_compute[187243]: 2025-12-02 23:43:24.936 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.246s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:43:24 compute-0 nova_compute[187243]: 2025-12-02 23:43:24.937 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:43:25 compute-0 sshd-session[196736]: Invalid user aj from 49.247.36.49 port 15787
Dec 02 23:43:25 compute-0 sshd-session[196736]: Received disconnect from 49.247.36.49 port 15787:11: Bye Bye [preauth]
Dec 02 23:43:25 compute-0 sshd-session[196736]: Disconnected from invalid user aj 49.247.36.49 port 15787 [preauth]
Dec 02 23:43:25 compute-0 podman[196751]: 2025-12-02 23:43:25.585966958 +0000 UTC m=+1.340600179 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 02 23:43:25 compute-0 podman[196848]: 2025-12-02 23:43:25.732458159 +0000 UTC m=+0.059044491 container create 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Dec 02 23:43:25 compute-0 podman[196848]: 2025-12-02 23:43:25.710145805 +0000 UTC m=+0.036732167 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 02 23:43:25 compute-0 python3[196738]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 02 23:43:25 compute-0 sudo[196735]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:26 compute-0 sudo[197035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aldtdlzetkcbolzsmgvaoqfqxpmvzwbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719006.3068693-1032-26890140095116/AnsiballZ_stat.py'
Dec 02 23:43:26 compute-0 sudo[197035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:26 compute-0 python3.9[197037]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:43:26 compute-0 sudo[197035]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:27 compute-0 sudo[197189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvkfwodkoqlqphtjjhbckiklyccmtsvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719007.3248868-1050-56266667024187/AnsiballZ_file.py'
Dec 02 23:43:27 compute-0 sudo[197189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:27 compute-0 python3.9[197191]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:27 compute-0 sudo[197189]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:28 compute-0 sudo[197340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbarricusfgzlcgxmqpjobffspogkowy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719007.954279-1050-187104002466210/AnsiballZ_copy.py'
Dec 02 23:43:28 compute-0 sudo[197340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:28 compute-0 python3.9[197342]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764719007.954279-1050-187104002466210/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:28 compute-0 sudo[197340]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:29 compute-0 sudo[197416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqvxvazcqxnmztqtftmkjsmvoiqahxge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719007.954279-1050-187104002466210/AnsiballZ_systemd.py'
Dec 02 23:43:29 compute-0 sudo[197416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:29 compute-0 python3.9[197418]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:43:29 compute-0 systemd[1]: Reloading.
Dec 02 23:43:29 compute-0 systemd-rc-local-generator[197450]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:29 compute-0 systemd-sysv-generator[197454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:29 compute-0 sudo[197416]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:30 compute-0 sudo[197530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdfgnqocsajzzwacazwgdixcopxfntfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719007.954279-1050-187104002466210/AnsiballZ_systemd.py'
Dec 02 23:43:30 compute-0 sudo[197530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:30 compute-0 python3.9[197532]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:43:30 compute-0 systemd[1]: Reloading.
Dec 02 23:43:30 compute-0 systemd-rc-local-generator[197556]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:30 compute-0 systemd-sysv-generator[197562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:30 compute-0 systemd[1]: Starting podman_exporter container...
Dec 02 23:43:31 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:43:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b437c8b93cb66aede7fc1f01a6400c66da52b1dad13943fb5abe11e37f59374/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b437c8b93cb66aede7fc1f01a6400c66da52b1dad13943fb5abe11e37f59374/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:31 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.
Dec 02 23:43:31 compute-0 podman[197572]: 2025-12-02 23:43:31.097945798 +0000 UTC m=+0.160810181 container init 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.122Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 02 23:43:31 compute-0 podman[197572]: 2025-12-02 23:43:31.122602189 +0000 UTC m=+0.185466492 container start 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.122Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.122Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.122Z caller=handler.go:105 level=info collector=container
Dec 02 23:43:31 compute-0 podman[197572]: podman_exporter
Dec 02 23:43:31 compute-0 systemd[1]: Starting Podman API Service...
Dec 02 23:43:31 compute-0 systemd[1]: Started Podman API Service.
Dec 02 23:43:31 compute-0 systemd[1]: Started podman_exporter container.
Dec 02 23:43:31 compute-0 sudo[197530]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="Setting parallel job count to 25"
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="Using sqlite as database backend"
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 02 23:43:31 compute-0 podman[197600]: @ - - [02/Dec/2025:23:43:31 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 02 23:43:31 compute-0 podman[197600]: time="2025-12-02T23:43:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:43:31 compute-0 podman[197597]: 2025-12-02 23:43:31.1977049 +0000 UTC m=+0.063169571 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:43:31 compute-0 podman[197600]: @ - - [02/Dec/2025:23:43:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14040 "" "Go-http-client/1.1"
Dec 02 23:43:31 compute-0 systemd[1]: 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d-1845338dc090a3a0.service: Main process exited, code=exited, status=1/FAILURE
Dec 02 23:43:31 compute-0 systemd[1]: 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d-1845338dc090a3a0.service: Failed with result 'exit-code'.
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.208Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.208Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 02 23:43:31 compute-0 podman_exporter[197588]: ts=2025-12-02T23:43:31.209Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 02 23:43:31 compute-0 sudo[197785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekipdmxgtjyxwzkhjzxqlevzqftsqzke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719011.436032-1098-216419530265446/AnsiballZ_systemd.py'
Dec 02 23:43:31 compute-0 sudo[197785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:32 compute-0 python3.9[197787]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:43:32 compute-0 systemd[1]: Stopping podman_exporter container...
Dec 02 23:43:32 compute-0 podman[197600]: @ - - [02/Dec/2025:23:43:31 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec 02 23:43:32 compute-0 systemd[1]: libpod-28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.scope: Deactivated successfully.
Dec 02 23:43:32 compute-0 podman[197791]: 2025-12-02 23:43:32.223068894 +0000 UTC m=+0.048584625 container died 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:43:32 compute-0 systemd[1]: 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d-1845338dc090a3a0.timer: Deactivated successfully.
Dec 02 23:43:32 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.
Dec 02 23:43:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d-userdata-shm.mount: Deactivated successfully.
Dec 02 23:43:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-4b437c8b93cb66aede7fc1f01a6400c66da52b1dad13943fb5abe11e37f59374-merged.mount: Deactivated successfully.
Dec 02 23:43:32 compute-0 podman[197791]: 2025-12-02 23:43:32.576385296 +0000 UTC m=+0.401901027 container cleanup 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:43:32 compute-0 podman[197791]: podman_exporter
Dec 02 23:43:32 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 23:43:32 compute-0 podman[197821]: podman_exporter
Dec 02 23:43:32 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 02 23:43:32 compute-0 systemd[1]: Stopped podman_exporter container.
Dec 02 23:43:32 compute-0 systemd[1]: Starting podman_exporter container...
Dec 02 23:43:32 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b437c8b93cb66aede7fc1f01a6400c66da52b1dad13943fb5abe11e37f59374/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b437c8b93cb66aede7fc1f01a6400c66da52b1dad13943fb5abe11e37f59374/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:32 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.
Dec 02 23:43:32 compute-0 podman[197832]: 2025-12-02 23:43:32.871415768 +0000 UTC m=+0.157212553 container init 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.886Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.887Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.887Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.887Z caller=handler.go:105 level=info collector=container
Dec 02 23:43:32 compute-0 podman[197600]: @ - - [02/Dec/2025:23:43:32 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 02 23:43:32 compute-0 podman[197600]: time="2025-12-02T23:43:32Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:43:32 compute-0 podman[197832]: 2025-12-02 23:43:32.911507885 +0000 UTC m=+0.197304600 container start 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:43:32 compute-0 podman[197832]: podman_exporter
Dec 02 23:43:32 compute-0 podman[197600]: @ - - [02/Dec/2025:23:43:32 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14042 "" "Go-http-client/1.1"
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.920Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.920Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 02 23:43:32 compute-0 podman_exporter[197847]: ts=2025-12-02T23:43:32.921Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 02 23:43:32 compute-0 systemd[1]: Started podman_exporter container.
Dec 02 23:43:32 compute-0 sudo[197785]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:33 compute-0 podman[197857]: 2025-12-02 23:43:33.014308411 +0000 UTC m=+0.095434007 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:43:33 compute-0 sudo[198031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inrbepbxfbnuqewwekwbbuemnoumtqqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719013.5007658-1114-256378222643988/AnsiballZ_stat.py'
Dec 02 23:43:33 compute-0 sudo[198031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:34 compute-0 python3.9[198033]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:43:34 compute-0 sudo[198031]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:34 compute-0 sudo[198154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsincvzwcrywtpnhjjcdvrnbjeyeytld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719013.5007658-1114-256378222643988/AnsiballZ_copy.py'
Dec 02 23:43:34 compute-0 sudo[198154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:34 compute-0 python3.9[198156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764719013.5007658-1114-256378222643988/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 02 23:43:34 compute-0 sudo[198154]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:35 compute-0 sudo[198306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvszsjpxzdbbwraujzufqqhobowrivfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719015.239957-1148-6572917016944/AnsiballZ_container_config_data.py'
Dec 02 23:43:35 compute-0 sudo[198306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:35 compute-0 python3.9[198308]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 02 23:43:35 compute-0 sudo[198306]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:36 compute-0 podman[198432]: 2025-12-02 23:43:36.56076441 +0000 UTC m=+0.073604116 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 23:43:36 compute-0 sudo[198489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdbwhligcptnimuxtlgzhfbwivxthdxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719016.1768148-1166-28521320941111/AnsiballZ_container_config_hash.py'
Dec 02 23:43:36 compute-0 sudo[198489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:36 compute-0 podman[198433]: 2025-12-02 23:43:36.641756804 +0000 UTC m=+0.141456479 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 02 23:43:36 compute-0 python3.9[198495]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 02 23:43:36 compute-0 sudo[198489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:37 compute-0 sudo[198654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etsodmivccmndxezmzyeoosvrshrdhwn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764719017.2603056-1186-18627677407197/AnsiballZ_edpm_container_manage.py'
Dec 02 23:43:37 compute-0 sudo[198654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:37 compute-0 python3[198656]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 02 23:43:40 compute-0 podman[198669]: 2025-12-02 23:43:40.444924249 +0000 UTC m=+2.462133677 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 02 23:43:40 compute-0 podman[198764]: 2025-12-02 23:43:40.584925802 +0000 UTC m=+0.042213040 container create b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter)
Dec 02 23:43:40 compute-0 podman[198764]: 2025-12-02 23:43:40.564038843 +0000 UTC m=+0.021326101 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 02 23:43:40 compute-0 python3[198656]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 02 23:43:40 compute-0 sudo[198654]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:41 compute-0 sudo[198954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cywtzlbbtoecpfrpreoorimqdiowygtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719021.0559845-1202-172771901244374/AnsiballZ_stat.py'
Dec 02 23:43:41 compute-0 sudo[198954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:41 compute-0 python3.9[198956]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:43:41 compute-0 sudo[198954]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:42 compute-0 sudo[199108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tolgpcgfjxrutxmihtdkhtzgnfvowlbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.084823-1220-143020334459799/AnsiballZ_file.py'
Dec 02 23:43:42 compute-0 sudo[199108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:42 compute-0 python3.9[199110]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:42 compute-0 sudo[199108]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:43 compute-0 sudo[199259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnbuybibqjolkrcxskekdehevzczkvkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.6949847-1220-181911744155285/AnsiballZ_copy.py'
Dec 02 23:43:43 compute-0 sudo[199259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:43 compute-0 python3.9[199261]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764719022.6949847-1220-181911744155285/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:43:43 compute-0 sudo[199259]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:43 compute-0 sudo[199335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlqzexkgmglrvugauagpfgzdryaxzhrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.6949847-1220-181911744155285/AnsiballZ_systemd.py'
Dec 02 23:43:43 compute-0 sudo[199335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:44 compute-0 python3.9[199337]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 02 23:43:44 compute-0 systemd[1]: Reloading.
Dec 02 23:43:44 compute-0 systemd-rc-local-generator[199362]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:44 compute-0 systemd-sysv-generator[199368]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:44 compute-0 sudo[199335]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:44 compute-0 sudo[199446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epntyhnlzlfoodshngcgzwcehqmynuwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719022.6949847-1220-181911744155285/AnsiballZ_systemd.py'
Dec 02 23:43:44 compute-0 sudo[199446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:44 compute-0 python3.9[199448]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:43:45 compute-0 systemd[1]: Reloading.
Dec 02 23:43:45 compute-0 systemd-rc-local-generator[199475]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:43:45 compute-0 systemd-sysv-generator[199481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:43:45 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 02 23:43:45 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:43:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.
Dec 02 23:43:45 compute-0 podman[199487]: 2025-12-02 23:43:45.47774967 +0000 UTC m=+0.138788505 container init b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *bridge.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *coverage.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *datapath.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *iface.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *memory.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *ovnnorthd.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *ovn.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *ovsdbserver.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *pmd_perf.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *pmd_rxq.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: INFO    23:43:45 main.go:48: registering *vswitch.Collector
Dec 02 23:43:45 compute-0 openstack_network_exporter[199503]: NOTICE  23:43:45 main.go:76: listening on https://:9105/metrics
Dec 02 23:43:45 compute-0 podman[199487]: 2025-12-02 23:43:45.512847575 +0000 UTC m=+0.173886370 container start b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 23:43:45 compute-0 podman[199487]: openstack_network_exporter
Dec 02 23:43:45 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 02 23:43:45 compute-0 sudo[199446]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:45 compute-0 podman[199508]: 2025-12-02 23:43:45.591649556 +0000 UTC m=+0.069866284 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec 02 23:43:46 compute-0 irqbalance[791]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 02 23:43:46 compute-0 irqbalance[791]: IRQ 26 affinity is now unmanaged
Dec 02 23:43:46 compute-0 sudo[199686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrmfircbddequmwxlkogdmrtpgfyaqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719026.5215597-1268-208379623189636/AnsiballZ_systemd.py'
Dec 02 23:43:46 compute-0 sudo[199686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:47 compute-0 python3.9[199688]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:43:47 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Dec 02 23:43:47 compute-0 systemd[1]: libpod-b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.scope: Deactivated successfully.
Dec 02 23:43:47 compute-0 podman[199692]: 2025-12-02 23:43:47.273609264 +0000 UTC m=+0.080213806 container died b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git)
Dec 02 23:43:47 compute-0 systemd[1]: b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84-602f1046343c982e.timer: Deactivated successfully.
Dec 02 23:43:47 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.
Dec 02 23:43:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84-userdata-shm.mount: Deactivated successfully.
Dec 02 23:43:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40-merged.mount: Deactivated successfully.
Dec 02 23:43:48 compute-0 podman[199692]: 2025-12-02 23:43:48.2428068 +0000 UTC m=+1.049411312 container cleanup b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:43:48 compute-0 podman[199692]: openstack_network_exporter
Dec 02 23:43:48 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 02 23:43:48 compute-0 podman[199718]: openstack_network_exporter
Dec 02 23:43:48 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 02 23:43:48 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Dec 02 23:43:48 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 02 23:43:48 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:43:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e66e379d5848c967caea310b2b53caa61ff752804f3aa25a032ad19a58cb3f40/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 02 23:43:48 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.
Dec 02 23:43:48 compute-0 podman[199729]: 2025-12-02 23:43:48.567473394 +0000 UTC m=+0.174318210 container init b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *bridge.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *coverage.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *datapath.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *iface.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *memory.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *ovnnorthd.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *ovn.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *ovsdbserver.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *pmd_perf.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *pmd_rxq.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: INFO    23:43:48 main.go:48: registering *vswitch.Collector
Dec 02 23:43:48 compute-0 openstack_network_exporter[199746]: NOTICE  23:43:48 main.go:76: listening on https://:9105/metrics
Dec 02 23:43:48 compute-0 podman[199729]: 2025-12-02 23:43:48.602273492 +0000 UTC m=+0.209118278 container start b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:43:48 compute-0 podman[199729]: openstack_network_exporter
Dec 02 23:43:48 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 02 23:43:48 compute-0 sudo[199686]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:48 compute-0 podman[199756]: 2025-12-02 23:43:48.690926413 +0000 UTC m=+0.071693398 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container)
Dec 02 23:43:49 compute-0 sudo[199929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qokyeuinriwwazhjgadiccxpzybmchzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719029.2167363-1284-91976544793862/AnsiballZ_find.py'
Dec 02 23:43:49 compute-0 sudo[199929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:43:49 compute-0 python3.9[199931]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 02 23:43:49 compute-0 sudo[199929]: pam_unix(sudo:session): session closed for user root
Dec 02 23:43:52 compute-0 podman[199956]: 2025-12-02 23:43:52.146722182 +0000 UTC m=+0.091504532 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 02 23:44:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:44:00.659 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:44:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:44:00.660 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:44:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:44:00.660 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:44:04 compute-0 podman[199977]: 2025-12-02 23:44:04.098308222 +0000 UTC m=+0.052770157 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:44:07 compute-0 podman[200003]: 2025-12-02 23:44:07.127973103 +0000 UTC m=+0.076322391 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:44:07 compute-0 podman[200004]: 2025-12-02 23:44:07.143380549 +0000 UTC m=+0.092688480 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:44:11 compute-0 sudo[200172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafmskdjldhkjrjqujrecsfrrflpvhga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719051.0189583-1468-156415149653753/AnsiballZ_podman_container_info.py'
Dec 02 23:44:11 compute-0 sudo[200172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:11 compute-0 python3.9[200174]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 02 23:44:11 compute-0 sudo[200172]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:12 compute-0 sudo[200337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgvavxaxgccfxehpyjjqdgdxwzkhywat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719051.9459913-1476-32196344774435/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:12 compute-0 sudo[200337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:12 compute-0 python3.9[200339]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:12 compute-0 systemd[1]: Started libpod-conmon-e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6.scope.
Dec 02 23:44:12 compute-0 podman[200340]: 2025-12-02 23:44:12.788165905 +0000 UTC m=+0.089394170 container exec e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 02 23:44:12 compute-0 podman[200340]: 2025-12-02 23:44:12.819095419 +0000 UTC m=+0.120323674 container exec_died e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 02 23:44:12 compute-0 systemd[1]: libpod-conmon-e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6.scope: Deactivated successfully.
Dec 02 23:44:12 compute-0 sudo[200337]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:13 compute-0 sudo[200522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlypochuvthoggffzwbkflbrzjozwejn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719053.1298735-1484-77933648000720/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:13 compute-0 sudo[200522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:13 compute-0 python3.9[200524]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:13 compute-0 systemd[1]: Started libpod-conmon-e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6.scope.
Dec 02 23:44:13 compute-0 podman[200525]: 2025-12-02 23:44:13.817242549 +0000 UTC m=+0.059905901 container exec e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:44:13 compute-0 podman[200525]: 2025-12-02 23:44:13.846109133 +0000 UTC m=+0.088772475 container exec_died e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 23:44:13 compute-0 systemd[1]: libpod-conmon-e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6.scope: Deactivated successfully.
Dec 02 23:44:13 compute-0 sudo[200522]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:14 compute-0 sudo[200708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtgunvrqlrofreleomermfargirprcyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719054.0935705-1492-22850757075598/AnsiballZ_file.py'
Dec 02 23:44:14 compute-0 sudo[200708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:14 compute-0 python3.9[200710]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:14 compute-0 sudo[200708]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:15 compute-0 sudo[200860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prqvthkccfewfteeqkolwszkrjajwcus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719055.009245-1501-147872665544959/AnsiballZ_podman_container_info.py'
Dec 02 23:44:15 compute-0 sudo[200860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:15 compute-0 python3.9[200862]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 02 23:44:15 compute-0 sudo[200860]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:16 compute-0 auditd[704]: Audit daemon rotating log files
Dec 02 23:44:16 compute-0 sudo[201025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieuxnwpnbvhupovzasmrlxwtsqohmvee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719055.8582306-1509-29110245580545/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:16 compute-0 sudo[201025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:16 compute-0 python3.9[201027]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:16 compute-0 systemd[1]: Started libpod-conmon-282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41.scope.
Dec 02 23:44:16 compute-0 podman[201028]: 2025-12-02 23:44:16.537786556 +0000 UTC m=+0.098797360 container exec 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:44:16 compute-0 podman[201028]: 2025-12-02 23:44:16.570982337 +0000 UTC m=+0.131993101 container exec_died 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, 
tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:44:16 compute-0 sudo[201025]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:16 compute-0 systemd[1]: libpod-conmon-282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41.scope: Deactivated successfully.
Dec 02 23:44:17 compute-0 sudo[201208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oskhvlyggzyjdpeofwarpwbdnvtpmzjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719056.7747982-1517-232550517107858/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:17 compute-0 sudo[201208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:17 compute-0 python3.9[201210]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:17 compute-0 systemd[1]: Started libpod-conmon-282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41.scope.
Dec 02 23:44:17 compute-0 podman[201211]: 2025-12-02 23:44:17.389535001 +0000 UTC m=+0.093390474 container exec 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 02 23:44:17 compute-0 podman[201211]: 2025-12-02 23:44:17.421045521 +0000 UTC m=+0.124901034 container exec_died 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 02 23:44:17 compute-0 systemd[1]: libpod-conmon-282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41.scope: Deactivated successfully.
Dec 02 23:44:17 compute-0 sudo[201208]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:17 compute-0 sudo[201392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydyaffddgjyweolhrfdrcyzaeuoghcmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719057.6638973-1525-264789308218436/AnsiballZ_file.py'
Dec 02 23:44:18 compute-0 sudo[201392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:18 compute-0 python3.9[201394]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:18 compute-0 sudo[201392]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:18 compute-0 sudo[201555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjwuuvraqhpiavnymtyejtxifkhktxtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719058.523005-1534-163083487435732/AnsiballZ_podman_container_info.py'
Dec 02 23:44:18 compute-0 sudo[201555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:18 compute-0 podman[201518]: 2025-12-02 23:44:18.90435398 +0000 UTC m=+0.091662741 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350)
Dec 02 23:44:19 compute-0 python3.9[201557]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 02 23:44:19 compute-0 sudo[201555]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:19 compute-0 sudo[201729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuoiysfovowfcfpimztrfiklrefbirqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719059.402004-1542-115986261495441/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:19 compute-0 sudo[201729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:19 compute-0 python3.9[201731]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:20 compute-0 systemd[1]: Started libpod-conmon-a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.scope.
Dec 02 23:44:20 compute-0 podman[201732]: 2025-12-02 23:44:20.06484574 +0000 UTC m=+0.075392164 container exec a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 02 23:44:20 compute-0 podman[201732]: 2025-12-02 23:44:20.097828516 +0000 UTC m=+0.108374910 container exec_died a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:44:20 compute-0 systemd[1]: libpod-conmon-a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.scope: Deactivated successfully.
Dec 02 23:44:20 compute-0 sudo[201729]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:20 compute-0 sudo[201913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwfjyikoabpoumzcyqzetbmfhruyqirv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719060.3253822-1550-215444294436061/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:20 compute-0 sudo[201913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:20 compute-0 python3.9[201915]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:20 compute-0 systemd[1]: Started libpod-conmon-a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.scope.
Dec 02 23:44:20 compute-0 podman[201916]: 2025-12-02 23:44:20.952893212 +0000 UTC m=+0.087475359 container exec a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:44:20 compute-0 podman[201916]: 2025-12-02 23:44:20.988988194 +0000 UTC m=+0.123570311 container exec_died a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:44:21 compute-0 systemd[1]: libpod-conmon-a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a.scope: Deactivated successfully.
Dec 02 23:44:21 compute-0 sudo[201913]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.045 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.047 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.564 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.564 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.564 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.564 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.565 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.565 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.565 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:44:21 compute-0 nova_compute[187243]: 2025-12-02 23:44:21.565 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:44:21 compute-0 sudo[202098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmxkccziedpwaxeoslcmfcttjjsefqix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719061.3330243-1558-234800710925203/AnsiballZ_file.py'
Dec 02 23:44:21 compute-0 sudo[202098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:21 compute-0 python3.9[202100]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:21 compute-0 sudo[202098]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.078 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.079 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.079 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.080 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.248 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.250 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.270 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.272 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6009MB free_disk=73.20033264160156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.272 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:44:22 compute-0 nova_compute[187243]: 2025-12-02 23:44:22.273 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:44:22 compute-0 sudo[202266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-newaydrpzyyfzdiclsskmvcmkdlmbekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719062.162839-1567-123638623027814/AnsiballZ_podman_container_info.py'
Dec 02 23:44:22 compute-0 sudo[202266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:22 compute-0 podman[202225]: 2025-12-02 23:44:22.474125848 +0000 UTC m=+0.066273841 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 02 23:44:22 compute-0 python3.9[202271]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 02 23:44:22 compute-0 sudo[202266]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:23 compute-0 sudo[202432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nswtmofrlfqjiygbnfyrcoejmbzcapyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719063.0037377-1575-106305338539023/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:23 compute-0 sudo[202432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:23 compute-0 nova_compute[187243]: 2025-12-02 23:44:23.360 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:44:23 compute-0 nova_compute[187243]: 2025-12-02 23:44:23.361 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:44:22 up 52 min,  0 user,  load average: 0.83, 0.83, 0.65\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:44:23 compute-0 nova_compute[187243]: 2025-12-02 23:44:23.391 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:44:23 compute-0 python3.9[202434]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:23 compute-0 systemd[1]: Started libpod-conmon-28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.scope.
Dec 02 23:44:23 compute-0 podman[202435]: 2025-12-02 23:44:23.692145123 +0000 UTC m=+0.116619301 container exec 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:44:23 compute-0 podman[202435]: 2025-12-02 23:44:23.72269042 +0000 UTC m=+0.147164608 container exec_died 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:44:23 compute-0 systemd[1]: libpod-conmon-28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.scope: Deactivated successfully.
Dec 02 23:44:23 compute-0 sudo[202432]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:23 compute-0 nova_compute[187243]: 2025-12-02 23:44:23.905 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:44:24 compute-0 sudo[202618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvnhqstkzgdddvwwrwkyvftqpphtcic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719063.974216-1583-246551813806349/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:24 compute-0 sudo[202618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:24 compute-0 nova_compute[187243]: 2025-12-02 23:44:24.414 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:44:24 compute-0 nova_compute[187243]: 2025-12-02 23:44:24.415 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:44:24 compute-0 python3.9[202620]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:24 compute-0 systemd[1]: Started libpod-conmon-28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.scope.
Dec 02 23:44:24 compute-0 podman[202621]: 2025-12-02 23:44:24.58651903 +0000 UTC m=+0.079375541 container exec 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:44:24 compute-0 podman[202621]: 2025-12-02 23:44:24.621896775 +0000 UTC m=+0.114753206 container exec_died 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:44:24 compute-0 systemd[1]: libpod-conmon-28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d.scope: Deactivated successfully.
Dec 02 23:44:24 compute-0 sudo[202618]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:25 compute-0 sudo[202802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fptbxkotffttaractwdffvnwshatnlyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719064.9153805-1591-230146107862255/AnsiballZ_file.py'
Dec 02 23:44:25 compute-0 sudo[202802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:25 compute-0 python3.9[202804]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:25 compute-0 sudo[202802]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:26 compute-0 sudo[202954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixrlaxwsmdiobqlsdyrfemvnpprzovgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719065.7330253-1600-171429195617830/AnsiballZ_podman_container_info.py'
Dec 02 23:44:26 compute-0 sudo[202954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:26 compute-0 python3.9[202956]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 02 23:44:26 compute-0 sudo[202954]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:26 compute-0 sudo[203119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpshgookhaffexpjmmyyjqcmdcwnidie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719066.5870311-1608-55858244709095/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:26 compute-0 sudo[203119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:27 compute-0 python3.9[203121]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:27 compute-0 systemd[1]: Started libpod-conmon-b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.scope.
Dec 02 23:44:27 compute-0 podman[203122]: 2025-12-02 23:44:27.240766814 +0000 UTC m=+0.080747324 container exec b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, io.buildah.version=1.33.7)
Dec 02 23:44:27 compute-0 podman[203122]: 2025-12-02 23:44:27.271139656 +0000 UTC m=+0.111120156 container exec_died b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, version=9.6, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9)
Dec 02 23:44:27 compute-0 systemd[1]: libpod-conmon-b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.scope: Deactivated successfully.
Dec 02 23:44:27 compute-0 sudo[203119]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:27 compute-0 sudo[203303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vltdaetjrdvsorclqvkzfaaflficwdwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719067.532328-1616-56615530805195/AnsiballZ_podman_container_exec.py'
Dec 02 23:44:27 compute-0 sudo[203303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:28 compute-0 python3.9[203305]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 02 23:44:28 compute-0 systemd[1]: Started libpod-conmon-b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.scope.
Dec 02 23:44:28 compute-0 podman[203306]: 2025-12-02 23:44:28.153378697 +0000 UTC m=+0.083522252 container exec b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, architecture=x86_64)
Dec 02 23:44:28 compute-0 podman[203306]: 2025-12-02 23:44:28.184448806 +0000 UTC m=+0.114592291 container exec_died b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Dec 02 23:44:28 compute-0 systemd[1]: libpod-conmon-b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84.scope: Deactivated successfully.
Dec 02 23:44:28 compute-0 sudo[203303]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:28 compute-0 sudo[203489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgsdqtstqhpbaqnhqzrlevdljcegrigf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719068.4719934-1624-207196248301407/AnsiballZ_file.py'
Dec 02 23:44:28 compute-0 sudo[203489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:29 compute-0 python3.9[203491]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:29 compute-0 sudo[203489]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:29 compute-0 sudo[203641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsgunvfcytgseldrkmksjzrcfwhdugz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719069.3598168-1634-236757326915677/AnsiballZ_file.py'
Dec 02 23:44:29 compute-0 sudo[203641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:29 compute-0 python3.9[203643]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:29 compute-0 sudo[203641]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:30 compute-0 sudo[203793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdxlnaexrabzguogbvrnsxizunnnesxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719070.1986995-1650-223420900054079/AnsiballZ_stat.py'
Dec 02 23:44:30 compute-0 sudo[203793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:30 compute-0 python3.9[203795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:30 compute-0 sudo[203793]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:31 compute-0 sudo[203916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-allbsxbxnvxrjyefadmbtoduoipqgmgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719070.1986995-1650-223420900054079/AnsiballZ_copy.py'
Dec 02 23:44:31 compute-0 sudo[203916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:31 compute-0 python3.9[203918]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764719070.1986995-1650-223420900054079/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:31 compute-0 sudo[203916]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:32 compute-0 sudo[204070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqmuqsrhiwynjlonlxrembxocjqpxvyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719072.115026-1682-25425138370387/AnsiballZ_file.py'
Dec 02 23:44:32 compute-0 sudo[204070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:32 compute-0 python3.9[204072]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:32 compute-0 sudo[204070]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:33 compute-0 sshd-session[203971]: Invalid user syncuser from 101.47.140.127 port 35396
Dec 02 23:44:33 compute-0 sshd-session[203971]: Received disconnect from 101.47.140.127 port 35396:11: Bye Bye [preauth]
Dec 02 23:44:33 compute-0 sshd-session[203971]: Disconnected from invalid user syncuser 101.47.140.127 port 35396 [preauth]
Dec 02 23:44:33 compute-0 sudo[204222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woskrzzcmateitnopqoieiakfewgitmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719073.5181656-1698-149697884840892/AnsiballZ_stat.py'
Dec 02 23:44:33 compute-0 sudo[204222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:34 compute-0 python3.9[204224]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:34 compute-0 sudo[204222]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:34 compute-0 sudo[204310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxgivdzvrrhglbuiqbyxgjhoxxyrrwub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719073.5181656-1698-149697884840892/AnsiballZ_file.py'
Dec 02 23:44:34 compute-0 sudo[204310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:34 compute-0 podman[204274]: 2025-12-02 23:44:34.44253003 +0000 UTC m=+0.080992220 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:44:34 compute-0 python3.9[204326]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:34 compute-0 sudo[204310]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:35 compute-0 sudo[204476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybrnicfqbrygiszzbdiewcuxcykhieon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719074.916081-1722-121520226824051/AnsiballZ_stat.py'
Dec 02 23:44:35 compute-0 sudo[204476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:35 compute-0 python3.9[204478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:35 compute-0 sudo[204476]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:35 compute-0 sudo[204554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxbkxxqhvnddqgrpxgvdhrhynfzkuzcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719074.916081-1722-121520226824051/AnsiballZ_file.py'
Dec 02 23:44:35 compute-0 sudo[204554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:36 compute-0 python3.9[204556]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hk24q9bo recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:36 compute-0 sudo[204554]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:36 compute-0 sudo[204706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjghpueyowvynaksjkakyffxjgontphg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719076.4127436-1746-170276984077677/AnsiballZ_stat.py'
Dec 02 23:44:36 compute-0 sudo[204706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:36 compute-0 python3.9[204708]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:37 compute-0 sudo[204706]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:37 compute-0 podman[204758]: 2025-12-02 23:44:37.270717305 +0000 UTC m=+0.053392496 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 02 23:44:37 compute-0 sudo[204813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eevzshfionamwqxkmwammpuhqxeomrzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719076.4127436-1746-170276984077677/AnsiballZ_file.py'
Dec 02 23:44:37 compute-0 sudo[204813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:37 compute-0 podman[204759]: 2025-12-02 23:44:37.337350253 +0000 UTC m=+0.107933298 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:44:37 compute-0 python3.9[204827]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:37 compute-0 sudo[204813]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:38 compute-0 sudo[204980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbjmjmcsrhmyupfukgocxvlmdcjownlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719077.987566-1772-269865966049524/AnsiballZ_command.py'
Dec 02 23:44:38 compute-0 sudo[204980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:38 compute-0 python3.9[204982]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:38 compute-0 sudo[204980]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:39 compute-0 sudo[205133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atkgsaxjhiapzirhtipducdxcvzlbalv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764719078.942168-1788-274496043238099/AnsiballZ_edpm_nftables_from_files.py'
Dec 02 23:44:39 compute-0 sudo[205133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:39 compute-0 python3[205135]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 02 23:44:39 compute-0 sudo[205133]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:40 compute-0 sudo[205285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xizfnjqxbgemrnjberiasrwwxpqshsws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719079.9498432-1804-269634875896573/AnsiballZ_stat.py'
Dec 02 23:44:40 compute-0 sudo[205285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:40 compute-0 python3.9[205287]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:40 compute-0 sudo[205285]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:40 compute-0 sudo[205363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhowazavtdfifycvnrljubstaqmzqayk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719079.9498432-1804-269634875896573/AnsiballZ_file.py'
Dec 02 23:44:40 compute-0 sudo[205363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:41 compute-0 python3.9[205365]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:41 compute-0 sudo[205363]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:41 compute-0 sudo[205515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvmqnfnumucwsggokhjcdugldgzvvvch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719081.3749754-1828-247110569678786/AnsiballZ_stat.py'
Dec 02 23:44:41 compute-0 sudo[205515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:42 compute-0 python3.9[205517]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:42 compute-0 sudo[205515]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:42 compute-0 sudo[205593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlrvamtrfewpfxnaraxlmqjngjwwjyxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719081.3749754-1828-247110569678786/AnsiballZ_file.py'
Dec 02 23:44:42 compute-0 sudo[205593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:42 compute-0 python3.9[205595]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:42 compute-0 sudo[205593]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:43 compute-0 sudo[205745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdlsnzdlqdzjtnqapcdybeeyjdclsay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719082.8568544-1852-130142416042537/AnsiballZ_stat.py'
Dec 02 23:44:43 compute-0 sudo[205745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:43 compute-0 python3.9[205747]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:43 compute-0 sudo[205745]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:43 compute-0 sudo[205823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssitylmjibbqmyglvdmyaqmzmivfwyoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719082.8568544-1852-130142416042537/AnsiballZ_file.py'
Dec 02 23:44:43 compute-0 sudo[205823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:43 compute-0 python3.9[205825]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:43 compute-0 sudo[205823]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:44 compute-0 sudo[205975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxlltxmkkwwswdkuutbseppsrezvrdvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719084.2446005-1876-171169871577620/AnsiballZ_stat.py'
Dec 02 23:44:44 compute-0 sudo[205975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:44 compute-0 python3.9[205977]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:44 compute-0 sudo[205975]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:45 compute-0 sudo[206053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfquqizlhylamtusnfyrxbndbgrwpwqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719084.2446005-1876-171169871577620/AnsiballZ_file.py'
Dec 02 23:44:45 compute-0 sudo[206053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:45 compute-0 python3.9[206055]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:45 compute-0 sudo[206053]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:46 compute-0 sudo[206205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmiyouegtkjchdmriziavptrzonuwkjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719085.7623026-1900-273099544771323/AnsiballZ_stat.py'
Dec 02 23:44:46 compute-0 sudo[206205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:46 compute-0 python3.9[206207]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 02 23:44:46 compute-0 sudo[206205]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:46 compute-0 sudo[206330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykwzuwfoypnohccpetzqvgqdqvosbnbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719085.7623026-1900-273099544771323/AnsiballZ_copy.py'
Dec 02 23:44:46 compute-0 sudo[206330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:47 compute-0 python3.9[206332]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764719085.7623026-1900-273099544771323/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:47 compute-0 sudo[206330]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:47 compute-0 sudo[206482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyompsxodcgfkbkuworhywmloigbtlfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719087.3495858-1930-246261103691806/AnsiballZ_file.py'
Dec 02 23:44:47 compute-0 sudo[206482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:47 compute-0 python3.9[206484]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:47 compute-0 sudo[206482]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:48 compute-0 sudo[206634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwqinuxmcnyvrgqtyykzwrvxzbsbqxst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719088.1804137-1946-113118971009769/AnsiballZ_command.py'
Dec 02 23:44:48 compute-0 sudo[206634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:48 compute-0 python3.9[206636]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:48 compute-0 sudo[206634]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:49 compute-0 podman[206664]: 2025-12-02 23:44:49.127104041 +0000 UTC m=+0.049064451 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 02 23:44:49 compute-0 sudo[206812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjloosincxkirbnboteloxoqlzfcmau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719089.0904899-1962-181029195496777/AnsiballZ_blockinfile.py'
Dec 02 23:44:49 compute-0 sudo[206812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:49 compute-0 python3.9[206814]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:49 compute-0 sudo[206812]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:50 compute-0 sudo[206966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuvmydijenqqwlqdffmnwjnkqkuwdwyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719090.0906758-1980-49120313038464/AnsiballZ_command.py'
Dec 02 23:44:50 compute-0 sudo[206966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:50 compute-0 python3.9[206968]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:50 compute-0 sudo[206966]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:51 compute-0 sshd-session[206839]: Received disconnect from 49.247.36.49 port 24657:11: Bye Bye [preauth]
Dec 02 23:44:51 compute-0 sshd-session[206839]: Disconnected from authenticating user root 49.247.36.49 port 24657 [preauth]
Dec 02 23:44:51 compute-0 sudo[207119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-assnfwmqmbslfgnyvriozbwepihnsqkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719091.061213-1996-180064716279551/AnsiballZ_stat.py'
Dec 02 23:44:51 compute-0 sudo[207119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:51 compute-0 python3.9[207121]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 02 23:44:51 compute-0 sudo[207119]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:52 compute-0 sudo[207273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fskaxvktsnxhymathhciyvziohdzhtvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719091.9235027-2012-111057189608273/AnsiballZ_command.py'
Dec 02 23:44:52 compute-0 sudo[207273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:52 compute-0 python3.9[207275]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:44:52 compute-0 sudo[207273]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:53 compute-0 podman[207402]: 2025-12-02 23:44:53.101184249 +0000 UTC m=+0.059272439 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:44:53 compute-0 sudo[207448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmfmxnrccjzbkbhncizvhzkmiwngflaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764719092.7942567-2028-19787256583962/AnsiballZ_file.py'
Dec 02 23:44:53 compute-0 sudo[207448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:44:53 compute-0 python3.9[207451]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:44:53 compute-0 sudo[207448]: pam_unix(sudo:session): session closed for user root
Dec 02 23:44:53 compute-0 sshd-session[187561]: Connection closed by 192.168.122.30 port 33720
Dec 02 23:44:53 compute-0 sshd-session[187558]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:44:53 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Dec 02 23:44:53 compute-0 systemd[1]: session-26.scope: Consumed 1min 27.194s CPU time.
Dec 02 23:44:53 compute-0 systemd-logind[795]: Session 26 logged out. Waiting for processes to exit.
Dec 02 23:44:53 compute-0 systemd-logind[795]: Removed session 26.
Dec 02 23:44:55 compute-0 sshd-session[207452]: Invalid user 12345 from 80.94.95.116 port 17230
Dec 02 23:44:56 compute-0 sshd-session[207452]: Connection closed by invalid user 12345 80.94.95.116 port 17230 [preauth]
Dec 02 23:44:59 compute-0 podman[197600]: time="2025-12-02T23:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:44:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:44:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2553 "" "Go-http-client/1.1"
Dec 02 23:44:59 compute-0 sshd-session[207478]: Accepted publickey for zuul from 38.102.83.66 port 42104 ssh2: RSA SHA256:hdlXDg7PlzRXiLISnY+IUpp6Y3Jc5y9DXpVHJTD4Z4A
Dec 02 23:44:59 compute-0 systemd-logind[795]: New session 27 of user zuul.
Dec 02 23:44:59 compute-0 systemd[1]: Started Session 27 of User zuul.
Dec 02 23:44:59 compute-0 sshd-session[207478]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 02 23:44:59 compute-0 sudo[207507]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltrfdragzlbgznkegrldtbnmcsziawqu ; /usr/bin/python3'
Dec 02 23:44:59 compute-0 sudo[207507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:00 compute-0 python3[207509]: ansible-ansible.legacy.dnf Invoked with name=['nfs-utils', 'iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Dec 02 23:45:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:45:00.661 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:45:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:45:00.662 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:45:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:45:00.662 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: ERROR   23:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: ERROR   23:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: ERROR   23:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: ERROR   23:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: ERROR   23:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:45:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:45:01 compute-0 sudo[207507]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:01 compute-0 sudo[207540]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sphguiqayfbdehlnlztndlshvnpsggqy ; /usr/bin/python3'
Dec 02 23:45:01 compute-0 sudo[207540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:01 compute-0 python3[207542]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=vers3 value=n backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:01 compute-0 sudo[207540]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:02 compute-0 sudo[207568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwcxcwhcveepwlddmnkofvxtoheyvsta ; /usr/bin/python3'
Dec 02 23:45:02 compute-0 sudo[207568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:02 compute-0 python3[207570]: ansible-ansible.builtin.systemd_service Invoked with name=rpc-statd.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Dec 02 23:45:02 compute-0 systemd[1]: Reloading.
Dec 02 23:45:02 compute-0 systemd-rc-local-generator[207594]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:45:02 compute-0 systemd-sysv-generator[207597]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:45:03 compute-0 sudo[207568]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:03 compute-0 sudo[207631]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reosbawjhwldvimnzxilyexbchjiqfdo ; /usr/bin/python3'
Dec 02 23:45:03 compute-0 sudo[207631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:03 compute-0 python3[207633]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Dec 02 23:45:03 compute-0 systemd[1]: Reloading.
Dec 02 23:45:03 compute-0 systemd-sysv-generator[207667]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:45:03 compute-0 systemd-rc-local-generator[207662]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:45:03 compute-0 systemd[1]: rpcbind.service: Current command vanished from the unit file, execution of the command list won't be resumed.
Dec 02 23:45:03 compute-0 sudo[207631]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:04 compute-0 sudo[207694]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxutwplqqfkggttrgrirhclydxmdpwyc ; /usr/bin/python3'
Dec 02 23:45:04 compute-0 sudo[207694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:04 compute-0 python3[207696]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.socket masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Dec 02 23:45:04 compute-0 systemd[1]: Reloading.
Dec 02 23:45:04 compute-0 systemd-rc-local-generator[207729]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:45:04 compute-0 systemd-sysv-generator[207732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:45:04 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Dec 02 23:45:04 compute-0 sudo[207694]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:04 compute-0 podman[207734]: 2025-12-02 23:45:04.699221211 +0000 UTC m=+0.057001924 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:45:04 compute-0 sudo[207782]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chkuwehhfofmfzzqtsmcqqtmkqqmegwp ; /usr/bin/python3'
Dec 02 23:45:04 compute-0 sudo[207782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:05 compute-0 python3[207784]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_1 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:05 compute-0 sudo[207782]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:05 compute-0 sudo[207808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncxzlmfuhnuksxostbkmdjuonldypmwu ; /usr/bin/python3'
Dec 02 23:45:05 compute-0 sudo[207808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:05 compute-0 python3[207810]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_2 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:05 compute-0 sudo[207808]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:05 compute-0 sudo[207834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guwcdaccdzeoeromhwfbkjvrexydiqhh ; /usr/bin/python3'
Dec 02 23:45:05 compute-0 sudo[207834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:05 compute-0 python3[207836]: ansible-ansible.builtin.file Invoked with path=/data/cinderbackup state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:05 compute-0 sudo[207834]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:07 compute-0 sudo[207912]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxurtghcaocadpolyahywrvulbpleknk ; /usr/bin/python3'
Dec 02 23:45:07 compute-0 sudo[207912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:07 compute-0 podman[207914]: 2025-12-02 23:45:07.554431886 +0000 UTC m=+0.105972961 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 02 23:45:07 compute-0 podman[207915]: 2025-12-02 23:45:07.587418952 +0000 UTC m=+0.134043987 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:45:07 compute-0 python3[207916]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/nfs-server.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 02 23:45:07 compute-0 sudo[207912]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:07 compute-0 sudo[208029]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjpepwhdttbvcrifvomczvlagodiqxvy ; /usr/bin/python3'
Dec 02 23:45:07 compute-0 sudo[208029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:08 compute-0 python3[208031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/nfs-server.nft mode=0666 src=/home/zuul/.ansible/tmp/ansible-tmp-1764719107.269738-36741-75957196238249/source _original_basename=tmpw1ebq3iy follow=False checksum=f91e6a2e98f3d3c48705976f5b33f9e81e7cf7f4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:08 compute-0 sudo[208029]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:08 compute-0 sudo[208079]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovmfiqwopqvhududagmdrysqetmhnepq ; /usr/bin/python3'
Dec 02 23:45:08 compute-0 sudo[208079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:08 compute-0 python3[208081]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/sysconfig/nftables.conf line=include "/etc/nftables/nfs-server.nft" insertafter=EOF state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:08 compute-0 sudo[208079]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:09 compute-0 sudo[208105]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gijbjmpjvvryzzjvweohjphvlwqvxscm ; /usr/bin/python3'
Dec 02 23:45:09 compute-0 sudo[208105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:09 compute-0 python3[208107]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 02 23:45:09 compute-0 systemd[1]: Stopping Netfilter Tables...
Dec 02 23:45:09 compute-0 systemd[1]: nftables.service: Deactivated successfully.
Dec 02 23:45:09 compute-0 systemd[1]: Stopped Netfilter Tables.
Dec 02 23:45:09 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 02 23:45:09 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 02 23:45:09 compute-0 sudo[208105]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:09 compute-0 sudo[208135]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkcjqytkxysvqnaiucgclzyroktkicfp ; /usr/bin/python3'
Dec 02 23:45:09 compute-0 sudo[208135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:09 compute-0 python3[208137]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=host value=172.18.0.100 backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:09 compute-0 sudo[208135]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:10 compute-0 sudo[208163]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxlyhuvoboacaxncihpdqbrgrboswav ; /usr/bin/python3'
Dec 02 23:45:10 compute-0 sudo[208163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:10 compute-0 python3[208165]: ansible-ansible.builtin.systemd Invoked with name=nfs-server state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 02 23:45:10 compute-0 systemd[1]: Reloading.
Dec 02 23:45:10 compute-0 systemd-rc-local-generator[208192]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 02 23:45:10 compute-0 systemd-sysv-generator[208198]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 02 23:45:10 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Dec 02 23:45:10 compute-0 systemd[1]: Mounting NFSD configuration filesystem...
Dec 02 23:45:10 compute-0 systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 23:45:10 compute-0 systemd[1]: Starting NFSv4 ID-name mapping service...
Dec 02 23:45:10 compute-0 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 02 23:45:10 compute-0 rpc.idmapd[208207]: Setting log level to 0
Dec 02 23:45:10 compute-0 systemd[1]: Started NFSv4 ID-name mapping service.
Dec 02 23:45:10 compute-0 systemd[1]: Mounted NFSD configuration filesystem.
Dec 02 23:45:10 compute-0 systemd[1]: Starting NFS Mount Daemon...
Dec 02 23:45:10 compute-0 systemd[1]: Starting NFSv4 Client Tracking Daemon...
Dec 02 23:45:10 compute-0 systemd[1]: Started NFSv4 Client Tracking Daemon.
Dec 02 23:45:10 compute-0 rpc.mountd[208214]: Version 2.5.4 starting
Dec 02 23:45:10 compute-0 systemd[1]: Started NFS Mount Daemon.
Dec 02 23:45:10 compute-0 systemd[1]: Starting NFS server and services...
Dec 02 23:45:10 compute-0 kernel: RPC: Registered rdma transport module.
Dec 02 23:45:10 compute-0 kernel: RPC: Registered rdma backchannel transport module.
Dec 02 23:45:11 compute-0 kernel: NFSD: Using nfsdcld client tracking operations.
Dec 02 23:45:11 compute-0 kernel: NFSD: no clients to reclaim, skipping NFSv4 grace period (net f0000000)
Dec 02 23:45:11 compute-0 systemd[1]: Reloading GSSAPI Proxy Daemon...
Dec 02 23:45:11 compute-0 systemd[1]: Reloaded GSSAPI Proxy Daemon.
Dec 02 23:45:11 compute-0 systemd[1]: Finished NFS server and services.
Dec 02 23:45:11 compute-0 sudo[208163]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:11 compute-0 sudo[208256]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grllanbtiwqqmsdrubfmodtkzfoksqeo ; /usr/bin/python3'
Dec 02 23:45:11 compute-0 sudo[208256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:11 compute-0 python3[208258]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_1 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:11 compute-0 sudo[208256]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:11 compute-0 sudo[208282]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yerbynimjrafouldrxafjwsczvaprglh ; /usr/bin/python3'
Dec 02 23:45:11 compute-0 sudo[208282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:11 compute-0 python3[208284]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_2 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:11 compute-0 sudo[208282]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:11 compute-0 sudo[208308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kchzslhrwjtdrtlnqqnzfxqjaukqqbym ; /usr/bin/python3'
Dec 02 23:45:11 compute-0 sudo[208308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:12 compute-0 python3[208310]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinderbackup 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 02 23:45:12 compute-0 sudo[208308]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:12 compute-0 sudo[208334]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sadgvqkxufvkzgawjxlycuuqvazexyqs ; /usr/bin/python3'
Dec 02 23:45:12 compute-0 sudo[208334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 02 23:45:12 compute-0 python3[208336]: ansible-ansible.legacy.command Invoked with _raw_params=exportfs -a _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 02 23:45:12 compute-0 sudo[208334]: pam_unix(sudo:session): session closed for user root
Dec 02 23:45:20 compute-0 podman[208338]: 2025-12-02 23:45:20.171465439 +0000 UTC m=+0.108382390 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 23:45:24 compute-0 podman[208358]: 2025-12-02 23:45:24.122611875 +0000 UTC m=+0.073416230 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.416 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.417 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.417 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.417 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.418 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.418 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.418 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.418 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.419 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.972 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.973 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.973 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:45:24 compute-0 nova_compute[187243]: 2025-12-02 23:45:24.974 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:45:25 compute-0 nova_compute[187243]: 2025-12-02 23:45:25.125 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:45:25 compute-0 nova_compute[187243]: 2025-12-02 23:45:25.127 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:45:25 compute-0 nova_compute[187243]: 2025-12-02 23:45:25.142 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:45:25 compute-0 nova_compute[187243]: 2025-12-02 23:45:25.142 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6023MB free_disk=73.19946670532227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:45:25 compute-0 nova_compute[187243]: 2025-12-02 23:45:25.143 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:45:25 compute-0 nova_compute[187243]: 2025-12-02 23:45:25.143 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:45:26 compute-0 nova_compute[187243]: 2025-12-02 23:45:26.187 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:45:26 compute-0 nova_compute[187243]: 2025-12-02 23:45:26.187 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:45:25 up 53 min,  0 user,  load average: 0.66, 0.79, 0.65\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:45:26 compute-0 nova_compute[187243]: 2025-12-02 23:45:26.204 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:45:26 compute-0 nova_compute[187243]: 2025-12-02 23:45:26.710 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:45:27 compute-0 nova_compute[187243]: 2025-12-02 23:45:27.222 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:45:27 compute-0 nova_compute[187243]: 2025-12-02 23:45:27.223 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.080s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:45:29 compute-0 podman[197600]: time="2025-12-02T23:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:45:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:45:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2567 "" "Go-http-client/1.1"
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: ERROR   23:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: ERROR   23:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: ERROR   23:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: ERROR   23:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: ERROR   23:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:45:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:45:35 compute-0 podman[208379]: 2025-12-02 23:45:35.160361757 +0000 UTC m=+0.106427034 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:45:38 compute-0 podman[208403]: 2025-12-02 23:45:38.124135451 +0000 UTC m=+0.076253180 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:45:38 compute-0 podman[208404]: 2025-12-02 23:45:38.210659919 +0000 UTC m=+0.159312293 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Dec 02 23:45:42 compute-0 sshd-session[208455]: Invalid user user8 from 45.78.219.213 port 51902
Dec 02 23:45:43 compute-0 sshd-session[208455]: Received disconnect from 45.78.219.213 port 51902:11: Bye Bye [preauth]
Dec 02 23:45:43 compute-0 sshd-session[208455]: Disconnected from invalid user user8 45.78.219.213 port 51902 [preauth]
Dec 02 23:45:51 compute-0 podman[208457]: 2025-12-02 23:45:51.155870145 +0000 UTC m=+0.092890524 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 02 23:45:53 compute-0 sshd-session[208478]: Invalid user jenkins from 102.210.148.92 port 37760
Dec 02 23:45:53 compute-0 sshd-session[208478]: Received disconnect from 102.210.148.92 port 37760:11: Bye Bye [preauth]
Dec 02 23:45:53 compute-0 sshd-session[208478]: Disconnected from invalid user jenkins 102.210.148.92 port 37760 [preauth]
Dec 02 23:45:55 compute-0 podman[208480]: 2025-12-02 23:45:55.148614112 +0000 UTC m=+0.090102576 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 02 23:45:59 compute-0 podman[197600]: time="2025-12-02T23:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:45:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:45:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2568 "" "Go-http-client/1.1"
Dec 02 23:46:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:46:00.663 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:46:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:46:00.663 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:46:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:46:00.663 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: ERROR   23:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: ERROR   23:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: ERROR   23:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: ERROR   23:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: ERROR   23:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:46:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:46:06 compute-0 podman[208501]: 2025-12-02 23:46:06.128895664 +0000 UTC m=+0.074170068 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:46:09 compute-0 podman[208525]: 2025-12-02 23:46:09.088411002 +0000 UTC m=+0.047427487 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:46:09 compute-0 podman[208526]: 2025-12-02 23:46:09.123414855 +0000 UTC m=+0.080160855 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 02 23:46:19 compute-0 sshd-session[208569]: Invalid user elsearch from 49.247.36.49 port 43770
Dec 02 23:46:19 compute-0 sshd-session[208569]: Received disconnect from 49.247.36.49 port 43770:11: Bye Bye [preauth]
Dec 02 23:46:19 compute-0 sshd-session[208569]: Disconnected from invalid user elsearch 49.247.36.49 port 43770 [preauth]
Dec 02 23:46:22 compute-0 podman[208571]: 2025-12-02 23:46:22.132353091 +0000 UTC m=+0.082583953 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.394 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.395 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.917 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.918 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.918 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.918 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.919 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.919 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.920 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:46:24 compute-0 nova_compute[187243]: 2025-12-02 23:46:24.920 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.442 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.442 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.443 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.443 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.672 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.674 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.703 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.704 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6102MB free_disk=73.20331192016602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.705 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:46:25 compute-0 nova_compute[187243]: 2025-12-02 23:46:25.706 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:46:26 compute-0 podman[208594]: 2025-12-02 23:46:26.135700417 +0000 UTC m=+0.084587982 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:46:26 compute-0 nova_compute[187243]: 2025-12-02 23:46:26.763 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:46:26 compute-0 nova_compute[187243]: 2025-12-02 23:46:26.764 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:46:25 up 54 min,  0 user,  load average: 0.24, 0.64, 0.60\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:46:26 compute-0 nova_compute[187243]: 2025-12-02 23:46:26.793 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:46:27 compute-0 nova_compute[187243]: 2025-12-02 23:46:27.301 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:46:27 compute-0 nova_compute[187243]: 2025-12-02 23:46:27.812 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:46:27 compute-0 nova_compute[187243]: 2025-12-02 23:46:27.813 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:46:29 compute-0 podman[197600]: time="2025-12-02T23:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:46:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:46:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Dec 02 23:46:31 compute-0 openstack_network_exporter[199746]: ERROR   23:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:46:31 compute-0 openstack_network_exporter[199746]: ERROR   23:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:46:31 compute-0 openstack_network_exporter[199746]: ERROR   23:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:46:31 compute-0 openstack_network_exporter[199746]: ERROR   23:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:46:31 compute-0 openstack_network_exporter[199746]: ERROR   23:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:46:37 compute-0 podman[208618]: 2025-12-02 23:46:37.109537321 +0000 UTC m=+0.064174591 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:46:40 compute-0 podman[208643]: 2025-12-02 23:46:40.133505387 +0000 UTC m=+0.079997906 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 02 23:46:40 compute-0 podman[208644]: 2025-12-02 23:46:40.165409533 +0000 UTC m=+0.111294248 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:46:52 compute-0 sshd-session[208686]: Invalid user frappe from 23.95.37.90 port 56946
Dec 02 23:46:52 compute-0 sshd-session[208686]: Received disconnect from 23.95.37.90 port 56946:11: Bye Bye [preauth]
Dec 02 23:46:52 compute-0 sshd-session[208686]: Disconnected from invalid user frappe 23.95.37.90 port 56946 [preauth]
Dec 02 23:46:52 compute-0 podman[208688]: 2025-12-02 23:46:52.448201198 +0000 UTC m=+0.060192677 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Dec 02 23:46:57 compute-0 podman[208709]: 2025-12-02 23:46:57.093576706 +0000 UTC m=+0.052838623 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:46:59 compute-0 podman[197600]: time="2025-12-02T23:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:46:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:46:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Dec 02 23:47:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:00.664 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:00.665 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:00.665 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:01 compute-0 openstack_network_exporter[199746]: ERROR   23:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:47:01 compute-0 openstack_network_exporter[199746]: ERROR   23:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:01 compute-0 openstack_network_exporter[199746]: ERROR   23:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:01 compute-0 openstack_network_exporter[199746]: ERROR   23:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:47:01 compute-0 openstack_network_exporter[199746]: ERROR   23:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:47:07 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:07.447 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:47:07 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:07.448 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:47:07 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:07.450 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:47:08 compute-0 podman[208735]: 2025-12-02 23:47:08.124989309 +0000 UTC m=+0.071829533 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:47:09 compute-0 sshd-session[208759]: Received disconnect from 45.78.222.160 port 52878:11: Bye Bye [preauth]
Dec 02 23:47:09 compute-0 sshd-session[208759]: Disconnected from authenticating user root 45.78.222.160 port 52878 [preauth]
Dec 02 23:47:11 compute-0 podman[208761]: 2025-12-02 23:47:11.130452627 +0000 UTC m=+0.078725186 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 02 23:47:11 compute-0 podman[208762]: 2025-12-02 23:47:11.19091082 +0000 UTC m=+0.132117582 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:47:12 compute-0 sshd-session[208806]: Invalid user cc from 20.123.120.169 port 58116
Dec 02 23:47:13 compute-0 sshd-session[208806]: Received disconnect from 20.123.120.169 port 58116:11: Bye Bye [preauth]
Dec 02 23:47:13 compute-0 sshd-session[208806]: Disconnected from invalid user cc 20.123.120.169 port 58116 [preauth]
Dec 02 23:47:13 compute-0 sshd-session[208733]: Connection closed by 45.78.218.154 port 58146 [preauth]
Dec 02 23:47:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:14.907 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:09:67 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22106c97f2524355a0bbadb98eaf5c22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13808bcd-e156-4466-92c4-34dec905a236, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=592edbb1-0110-4221-9f00-b76b4034be4b) old=Port_Binding(mac=['fa:16:3e:0d:09:67'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17f7ebac-3ad4-4ff9-8a59-57891d829652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22106c97f2524355a0bbadb98eaf5c22', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:47:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:14.908 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 592edbb1-0110-4221-9f00-b76b4034be4b in datapath 17f7ebac-3ad4-4ff9-8a59-57891d829652 updated
Dec 02 23:47:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:14.909 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17f7ebac-3ad4-4ff9-8a59-57891d829652, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:47:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:14.910 104379 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp03nkwgnt/privsep.sock']
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.599 104379 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.599 104379 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp03nkwgnt/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.447 208813 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.451 208813 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.453 208813 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.453 208813 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208813
Dec 02 23:47:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:15.601 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[21a8d9c0-4428-42c7-a875-fe39ed0619ea]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:47:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:16.035 208813 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:16.035 208813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:16.035 208813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:16.465 208813 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 02 23:47:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:16.471 208813 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 02 23:47:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:47:16.505 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb8b03c-7ef0-4ed8-99c9-78a8d702e1f5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:47:17 compute-0 nova_compute[187243]: 2025-12-02 23:47:17.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:17 compute-0 nova_compute[187243]: 2025-12-02 23:47:17.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 02 23:47:18 compute-0 nova_compute[187243]: 2025-12-02 23:47:18.115 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 02 23:47:18 compute-0 nova_compute[187243]: 2025-12-02 23:47:18.116 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:18 compute-0 nova_compute[187243]: 2025-12-02 23:47:18.116 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 02 23:47:18 compute-0 nova_compute[187243]: 2025-12-02 23:47:18.625 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:21 compute-0 nova_compute[187243]: 2025-12-02 23:47:21.131 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:21 compute-0 nova_compute[187243]: 2025-12-02 23:47:21.132 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:21 compute-0 nova_compute[187243]: 2025-12-02 23:47:21.132 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:47:21 compute-0 nova_compute[187243]: 2025-12-02 23:47:21.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:21 compute-0 nova_compute[187243]: 2025-12-02 23:47:21.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:22 compute-0 nova_compute[187243]: 2025-12-02 23:47:22.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:22 compute-0 nova_compute[187243]: 2025-12-02 23:47:22.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:22 compute-0 nova_compute[187243]: 2025-12-02 23:47:22.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.112 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.112 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.113 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.113 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:47:23 compute-0 podman[208818]: 2025-12-02 23:47:23.137875696 +0000 UTC m=+0.091256383 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.282 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.284 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.301 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.302 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5995MB free_disk=73.20328903198242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.302 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:47:23 compute-0 nova_compute[187243]: 2025-12-02 23:47:23.302 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:47:24 compute-0 nova_compute[187243]: 2025-12-02 23:47:24.399 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:47:24 compute-0 nova_compute[187243]: 2025-12-02 23:47:24.400 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:47:23 up 55 min,  0 user,  load average: 0.24, 0.56, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:47:24 compute-0 nova_compute[187243]: 2025-12-02 23:47:24.429 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:47:24 compute-0 nova_compute[187243]: 2025-12-02 23:47:24.938 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:47:25 compute-0 nova_compute[187243]: 2025-12-02 23:47:25.449 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:47:25 compute-0 nova_compute[187243]: 2025-12-02 23:47:25.450 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:47:26 compute-0 nova_compute[187243]: 2025-12-02 23:47:26.450 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:47:28 compute-0 podman[208841]: 2025-12-02 23:47:28.124161713 +0000 UTC m=+0.069324244 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 02 23:47:29 compute-0 podman[197600]: time="2025-12-02T23:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:47:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:47:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2565 "" "Go-http-client/1.1"
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: ERROR   23:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: ERROR   23:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: ERROR   23:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: ERROR   23:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: ERROR   23:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:47:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:47:39 compute-0 podman[208863]: 2025-12-02 23:47:39.096399861 +0000 UTC m=+0.055081475 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:47:42 compute-0 podman[208887]: 2025-12-02 23:47:42.146942108 +0000 UTC m=+0.096660828 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Dec 02 23:47:42 compute-0 podman[208888]: 2025-12-02 23:47:42.181257185 +0000 UTC m=+0.124813934 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 02 23:47:47 compute-0 sshd-session[208862]: error: kex_exchange_identification: read: Connection timed out
Dec 02 23:47:47 compute-0 sshd-session[208862]: banner exchange: Connection from 120.48.147.81 port 36104: Connection timed out
Dec 02 23:47:49 compute-0 sshd-session[208934]: Invalid user mark from 49.247.36.49 port 27097
Dec 02 23:47:49 compute-0 sshd-session[208934]: Received disconnect from 49.247.36.49 port 27097:11: Bye Bye [preauth]
Dec 02 23:47:49 compute-0 sshd-session[208934]: Disconnected from invalid user mark 49.247.36.49 port 27097 [preauth]
Dec 02 23:47:54 compute-0 podman[208936]: 2025-12-02 23:47:54.153150989 +0000 UTC m=+0.090455916 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:47:59 compute-0 podman[208957]: 2025-12-02 23:47:59.142254864 +0000 UTC m=+0.090528488 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Dec 02 23:47:59 compute-0 podman[197600]: time="2025-12-02T23:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:47:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:47:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec 02 23:48:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:48:00.666 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:48:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:48:00.666 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:48:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:48:00.667 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: ERROR   23:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: ERROR   23:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: ERROR   23:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: ERROR   23:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: ERROR   23:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:48:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:48:10 compute-0 podman[208978]: 2025-12-02 23:48:10.115231735 +0000 UTC m=+0.076494536 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:48:13 compute-0 podman[209003]: 2025-12-02 23:48:13.136695934 +0000 UTC m=+0.082638027 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 02 23:48:13 compute-0 podman[209004]: 2025-12-02 23:48:13.196991407 +0000 UTC m=+0.136913812 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Dec 02 23:48:19 compute-0 nova_compute[187243]: 2025-12-02 23:48:19.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:19 compute-0 nova_compute[187243]: 2025-12-02 23:48:19.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:48:20 compute-0 sshd-session[209050]: Received disconnect from 117.5.148.56 port 53376:11:  [preauth]
Dec 02 23:48:20 compute-0 sshd-session[209050]: Disconnected from authenticating user root 117.5.148.56 port 53376 [preauth]
Dec 02 23:48:21 compute-0 nova_compute[187243]: 2025-12-02 23:48:21.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:22 compute-0 sshd-session[209048]: Invalid user azureuser from 45.78.219.213 port 45340
Dec 02 23:48:22 compute-0 sshd-session[209048]: Received disconnect from 45.78.219.213 port 45340:11: Bye Bye [preauth]
Dec 02 23:48:22 compute-0 sshd-session[209048]: Disconnected from invalid user azureuser 45.78.219.213 port 45340 [preauth]
Dec 02 23:48:22 compute-0 nova_compute[187243]: 2025-12-02 23:48:22.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:22 compute-0 nova_compute[187243]: 2025-12-02 23:48:22.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:23 compute-0 nova_compute[187243]: 2025-12-02 23:48:23.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:24 compute-0 nova_compute[187243]: 2025-12-02 23:48:24.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:24 compute-0 nova_compute[187243]: 2025-12-02 23:48:24.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.105 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:48:25 compute-0 podman[209052]: 2025-12-02 23:48:25.107714157 +0000 UTC m=+0.071217525 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, 
maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.245 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.246 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.265 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.265 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6027MB free_disk=73.20330810546875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.266 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:48:25 compute-0 nova_compute[187243]: 2025-12-02 23:48:25.266 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.342 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.343 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:48:25 up 56 min,  0 user,  load average: 0.22, 0.48, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.396 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.450 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.450 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.466 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.495 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 02 23:48:26 compute-0 nova_compute[187243]: 2025-12-02 23:48:26.518 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:48:27 compute-0 nova_compute[187243]: 2025-12-02 23:48:27.028 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:48:27 compute-0 nova_compute[187243]: 2025-12-02 23:48:27.536 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:48:27 compute-0 nova_compute[187243]: 2025-12-02 23:48:27.536 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.270s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:48:28 compute-0 nova_compute[187243]: 2025-12-02 23:48:28.533 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:29 compute-0 nova_compute[187243]: 2025-12-02 23:48:29.042 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:48:29 compute-0 podman[197600]: time="2025-12-02T23:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:48:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:48:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2574 "" "Go-http-client/1.1"
Dec 02 23:48:30 compute-0 podman[209074]: 2025-12-02 23:48:30.112293246 +0000 UTC m=+0.067136891 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: ERROR   23:48:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: ERROR   23:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: ERROR   23:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: ERROR   23:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: ERROR   23:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:48:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:48:41 compute-0 podman[209095]: 2025-12-02 23:48:41.125134279 +0000 UTC m=+0.078588326 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:48:44 compute-0 podman[209119]: 2025-12-02 23:48:44.106936555 +0000 UTC m=+0.067039118 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:48:44 compute-0 podman[209120]: 2025-12-02 23:48:44.170390605 +0000 UTC m=+0.115956837 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Dec 02 23:48:51 compute-0 sshd-session[209166]: Received disconnect from 45.78.219.95 port 41902:11: Bye Bye [preauth]
Dec 02 23:48:51 compute-0 sshd-session[209166]: Disconnected from authenticating user root 45.78.219.95 port 41902 [preauth]
Dec 02 23:48:56 compute-0 podman[209169]: 2025-12-02 23:48:56.107854246 +0000 UTC m=+0.065135411 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Dec 02 23:48:59 compute-0 podman[197600]: time="2025-12-02T23:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:48:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:48:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Dec 02 23:49:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:49:00.667 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:49:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:49:00.668 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:49:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:49:00.668 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:49:01 compute-0 podman[209192]: 2025-12-02 23:49:01.110358155 +0000 UTC m=+0.065397868 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: ERROR   23:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: ERROR   23:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: ERROR   23:49:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: ERROR   23:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: ERROR   23:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:49:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:49:04 compute-0 sshd-session[209212]: Invalid user user1 from 23.95.37.90 port 46472
Dec 02 23:49:04 compute-0 sshd-session[209212]: Received disconnect from 23.95.37.90 port 46472:11: Bye Bye [preauth]
Dec 02 23:49:04 compute-0 sshd-session[209212]: Disconnected from invalid user user1 23.95.37.90 port 46472 [preauth]
Dec 02 23:49:06 compute-0 sshd-session[209214]: error: kex_exchange_identification: read: Connection reset by peer
Dec 02 23:49:06 compute-0 sshd-session[209214]: Connection reset by 101.47.140.127 port 46038
Dec 02 23:49:12 compute-0 podman[209215]: 2025-12-02 23:49:12.126917062 +0000 UTC m=+0.068802900 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:49:15 compute-0 podman[209238]: 2025-12-02 23:49:15.109640901 +0000 UTC m=+0.057704262 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:49:15 compute-0 podman[209239]: 2025-12-02 23:49:15.172059686 +0000 UTC m=+0.122038704 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 02 23:49:19 compute-0 sshd-session[209281]: Invalid user nrk from 20.123.120.169 port 57166
Dec 02 23:49:19 compute-0 sshd-session[209281]: Received disconnect from 20.123.120.169 port 57166:11: Bye Bye [preauth]
Dec 02 23:49:19 compute-0 sshd-session[209281]: Disconnected from invalid user nrk 20.123.120.169 port 57166 [preauth]
Dec 02 23:49:19 compute-0 nova_compute[187243]: 2025-12-02 23:49:19.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:19 compute-0 nova_compute[187243]: 2025-12-02 23:49:19.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:49:19 compute-0 sshd-session[209283]: Invalid user student1 from 49.247.36.49 port 52221
Dec 02 23:49:20 compute-0 sshd-session[209283]: Received disconnect from 49.247.36.49 port 52221:11: Bye Bye [preauth]
Dec 02 23:49:20 compute-0 sshd-session[209283]: Disconnected from invalid user student1 49.247.36.49 port 52221 [preauth]
Dec 02 23:49:21 compute-0 nova_compute[187243]: 2025-12-02 23:49:21.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:22 compute-0 nova_compute[187243]: 2025-12-02 23:49:22.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:23 compute-0 nova_compute[187243]: 2025-12-02 23:49:23.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:23 compute-0 nova_compute[187243]: 2025-12-02 23:49:23.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:24 compute-0 nova_compute[187243]: 2025-12-02 23:49:24.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.108 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.258 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.260 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.276 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.277 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6050MB free_disk=73.20378875732422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.277 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:49:25 compute-0 nova_compute[187243]: 2025-12-02 23:49:25.277 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:49:26 compute-0 nova_compute[187243]: 2025-12-02 23:49:26.482 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:49:26 compute-0 nova_compute[187243]: 2025-12-02 23:49:26.483 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:49:25 up 57 min,  0 user,  load average: 0.25, 0.44, 0.53\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:49:26 compute-0 nova_compute[187243]: 2025-12-02 23:49:26.506 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:49:27 compute-0 nova_compute[187243]: 2025-12-02 23:49:27.014 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:49:27 compute-0 podman[209287]: 2025-12-02 23:49:27.100390327 +0000 UTC m=+0.054346201 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 23:49:27 compute-0 nova_compute[187243]: 2025-12-02 23:49:27.526 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:49:27 compute-0 nova_compute[187243]: 2025-12-02 23:49:27.526 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.249s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:49:28 compute-0 nova_compute[187243]: 2025-12-02 23:49:28.527 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:28 compute-0 nova_compute[187243]: 2025-12-02 23:49:28.527 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:49:29 compute-0 podman[197600]: time="2025-12-02T23:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:49:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:49:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: ERROR   23:49:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: ERROR   23:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: ERROR   23:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: ERROR   23:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: ERROR   23:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:49:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:49:32 compute-0 podman[209309]: 2025-12-02 23:49:32.141391463 +0000 UTC m=+0.093161277 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:49:42 compute-0 sshd-session[209330]: Invalid user deploy from 61.220.235.10 port 40296
Dec 02 23:49:43 compute-0 podman[209332]: 2025-12-02 23:49:43.078912244 +0000 UTC m=+0.083473324 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:49:43 compute-0 sshd-session[209330]: Received disconnect from 61.220.235.10 port 40296:11: Bye Bye [preauth]
Dec 02 23:49:43 compute-0 sshd-session[209330]: Disconnected from invalid user deploy 61.220.235.10 port 40296 [preauth]
Dec 02 23:49:45 compute-0 sshd-session[209356]: Received disconnect from 102.210.148.92 port 55660:11: Bye Bye [preauth]
Dec 02 23:49:45 compute-0 sshd-session[209356]: Disconnected from authenticating user root 102.210.148.92 port 55660 [preauth]
Dec 02 23:49:46 compute-0 podman[209358]: 2025-12-02 23:49:46.131413445 +0000 UTC m=+0.080228765 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:49:46 compute-0 podman[209359]: 2025-12-02 23:49:46.184624578 +0000 UTC m=+0.119270627 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:49:55 compute-0 sshd-session[209286]: error: kex_exchange_identification: read: Connection timed out
Dec 02 23:49:55 compute-0 sshd-session[209286]: banner exchange: Connection from 45.78.218.154 port 50672: Connection timed out
Dec 02 23:49:58 compute-0 podman[209404]: 2025-12-02 23:49:58.1519911 +0000 UTC m=+0.095996206 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Dec 02 23:49:59 compute-0 podman[197600]: time="2025-12-02T23:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:49:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:49:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec 02 23:50:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:00.669 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:00.669 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:00.669 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: ERROR   23:50:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: ERROR   23:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: ERROR   23:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: ERROR   23:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:50:01 compute-0 openstack_network_exporter[199746]: ERROR   23:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:03 compute-0 podman[209428]: 2025-12-02 23:50:03.10275383 +0000 UTC m=+0.063943173 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 02 23:50:11 compute-0 sshd-session[209448]: Invalid user admin1 from 23.95.37.90 port 58254
Dec 02 23:50:11 compute-0 sshd-session[209448]: Received disconnect from 23.95.37.90 port 58254:11: Bye Bye [preauth]
Dec 02 23:50:11 compute-0 sshd-session[209448]: Disconnected from invalid user admin1 23.95.37.90 port 58254 [preauth]
Dec 02 23:50:12 compute-0 sshd-session[207483]: Received disconnect from 38.102.83.66 port 42104:11: disconnected by user
Dec 02 23:50:12 compute-0 sshd-session[207483]: Disconnected from user zuul 38.102.83.66 port 42104
Dec 02 23:50:12 compute-0 sshd-session[207478]: pam_unix(sshd:session): session closed for user zuul
Dec 02 23:50:12 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Dec 02 23:50:12 compute-0 systemd[1]: session-27.scope: Consumed 7.306s CPU time.
Dec 02 23:50:12 compute-0 systemd-logind[795]: Session 27 logged out. Waiting for processes to exit.
Dec 02 23:50:12 compute-0 systemd-logind[795]: Removed session 27.
Dec 02 23:50:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:14.079 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:50:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:14.080 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:50:14 compute-0 podman[209450]: 2025-12-02 23:50:14.144216597 +0000 UTC m=+0.087822268 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:50:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:15.048 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:7d:48 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b76ae8150c43ac98862da676697b95', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d26e027-073c-46dd-95e9-4c77fc749b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7b376433-fabc-48f0-aa10-0098b8d1cf58) old=Port_Binding(mac=['fa:16:3e:9a:7d:48'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b76ae8150c43ac98862da676697b95', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:50:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:15.048 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7b376433-fabc-48f0-aa10-0098b8d1cf58 in datapath 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 updated
Dec 02 23:50:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:15.050 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:50:15 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:15.051 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b13f2fae-d719-472a-879e-b67d698f6ba7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:50:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:16.081 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:50:17 compute-0 podman[209475]: 2025-12-02 23:50:17.125505591 +0000 UTC m=+0.077755806 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:50:17 compute-0 podman[209476]: 2025-12-02 23:50:17.184682778 +0000 UTC m=+0.142287032 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:50:21 compute-0 nova_compute[187243]: 2025-12-02 23:50:21.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:21 compute-0 nova_compute[187243]: 2025-12-02 23:50:21.594 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:50:22 compute-0 nova_compute[187243]: 2025-12-02 23:50:22.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:23 compute-0 nova_compute[187243]: 2025-12-02 23:50:23.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:23 compute-0 nova_compute[187243]: 2025-12-02 23:50:23.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:24 compute-0 sshd-session[209518]: Invalid user user1 from 20.123.120.169 port 37116
Dec 02 23:50:24 compute-0 sshd-session[209518]: Received disconnect from 20.123.120.169 port 37116:11: Bye Bye [preauth]
Dec 02 23:50:24 compute-0 sshd-session[209518]: Disconnected from invalid user user1 20.123.120.169 port 37116 [preauth]
Dec 02 23:50:25 compute-0 nova_compute[187243]: 2025-12-02 23:50:25.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:26 compute-0 nova_compute[187243]: 2025-12-02 23:50:26.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.113 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.114 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.114 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.114 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.299 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.299 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.314 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.315 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6059MB free_disk=73.20378875732422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.315 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:27 compute-0 nova_compute[187243]: 2025-12-02 23:50:27.315 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:27 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:27.643 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:fa:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aad1654ac0c43c38292ab72dec9fb3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=030d3419-d041-4d8e-8886-4428ecbcc3b5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=01730371-dcdb-43cc-98fa-0362fa6b15a8) old=Port_Binding(mac=['fa:16:3e:ea:fa:1b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0bd9450f-3ee4-4a30-8a14-b8735bd45c3e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aad1654ac0c43c38292ab72dec9fb3a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:50:27 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:27.644 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 01730371-dcdb-43cc-98fa-0362fa6b15a8 in datapath 0bd9450f-3ee4-4a30-8a14-b8735bd45c3e updated
Dec 02 23:50:27 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:27.646 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0bd9450f-3ee4-4a30-8a14-b8735bd45c3e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:50:27 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:50:27.647 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c4274d7e-d89b-4deb-be90-2711327f3a84]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:50:28 compute-0 nova_compute[187243]: 2025-12-02 23:50:28.364 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:50:28 compute-0 nova_compute[187243]: 2025-12-02 23:50:28.365 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:50:27 up 58 min,  0 user,  load average: 0.09, 0.36, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:50:28 compute-0 nova_compute[187243]: 2025-12-02 23:50:28.392 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:50:28 compute-0 nova_compute[187243]: 2025-12-02 23:50:28.904 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:50:29 compute-0 podman[209521]: 2025-12-02 23:50:29.138013123 +0000 UTC m=+0.086722472 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 02 23:50:29 compute-0 nova_compute[187243]: 2025-12-02 23:50:29.413 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:50:29 compute-0 nova_compute[187243]: 2025-12-02 23:50:29.414 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:29 compute-0 podman[197600]: time="2025-12-02T23:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:50:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:50:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Dec 02 23:50:30 compute-0 nova_compute[187243]: 2025-12-02 23:50:30.414 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:30 compute-0 nova_compute[187243]: 2025-12-02 23:50:30.414 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:31 compute-0 nova_compute[187243]: 2025-12-02 23:50:31.243 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:50:31 compute-0 openstack_network_exporter[199746]: ERROR   23:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:31 compute-0 openstack_network_exporter[199746]: ERROR   23:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:50:31 compute-0 openstack_network_exporter[199746]: ERROR   23:50:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:50:31 compute-0 openstack_network_exporter[199746]: ERROR   23:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:50:31 compute-0 openstack_network_exporter[199746]: ERROR   23:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:50:34 compute-0 podman[209543]: 2025-12-02 23:50:34.16408886 +0000 UTC m=+0.110656269 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:50:45 compute-0 podman[209564]: 2025-12-02 23:50:45.166928353 +0000 UTC m=+0.114281964 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:50:48 compute-0 podman[209590]: 2025-12-02 23:50:48.116182846 +0000 UTC m=+0.064048462 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:50:48 compute-0 podman[209591]: 2025-12-02 23:50:48.237983045 +0000 UTC m=+0.172043607 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 02 23:50:49 compute-0 sshd-session[209630]: Received disconnect from 49.247.36.49 port 47710:11: Bye Bye [preauth]
Dec 02 23:50:49 compute-0 sshd-session[209630]: Disconnected from authenticating user root 49.247.36.49 port 47710 [preauth]
Dec 02 23:50:54 compute-0 nova_compute[187243]: 2025-12-02 23:50:54.749 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:54 compute-0 nova_compute[187243]: 2025-12-02 23:50:54.749 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:55 compute-0 nova_compute[187243]: 2025-12-02 23:50:55.258 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:50:55 compute-0 nova_compute[187243]: 2025-12-02 23:50:55.915 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:50:55 compute-0 nova_compute[187243]: 2025-12-02 23:50:55.916 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:50:55 compute-0 nova_compute[187243]: 2025-12-02 23:50:55.924 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:50:55 compute-0 nova_compute[187243]: 2025-12-02 23:50:55.925 187247 INFO nova.compute.claims [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Claim successful on node compute-0.ctlplane.example.com
Dec 02 23:50:56 compute-0 sshd-session[209638]: Received disconnect from 45.78.219.213 port 33590:11: Bye Bye [preauth]
Dec 02 23:50:56 compute-0 sshd-session[209638]: Disconnected from authenticating user root 45.78.219.213 port 33590 [preauth]
Dec 02 23:50:57 compute-0 nova_compute[187243]: 2025-12-02 23:50:57.011 187247 DEBUG nova.compute.provider_tree [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:50:57 compute-0 nova_compute[187243]: 2025-12-02 23:50:57.522 187247 DEBUG nova.scheduler.client.report [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:50:58 compute-0 nova_compute[187243]: 2025-12-02 23:50:58.034 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:50:58 compute-0 nova_compute[187243]: 2025-12-02 23:50:58.037 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:50:58 compute-0 nova_compute[187243]: 2025-12-02 23:50:58.554 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:50:58 compute-0 nova_compute[187243]: 2025-12-02 23:50:58.555 187247 DEBUG nova.network.neutron [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:50:58 compute-0 nova_compute[187243]: 2025-12-02 23:50:58.556 187247 WARNING neutronclient.v2_0.client [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:50:58 compute-0 nova_compute[187243]: 2025-12-02 23:50:58.559 187247 WARNING neutronclient.v2_0.client [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:50:59 compute-0 nova_compute[187243]: 2025-12-02 23:50:59.070 187247 INFO nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:50:59 compute-0 nova_compute[187243]: 2025-12-02 23:50:59.580 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:50:59 compute-0 podman[197600]: time="2025-12-02T23:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:50:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:50:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2573 "" "Go-http-client/1.1"
Dec 02 23:51:00 compute-0 podman[209643]: 2025-12-02 23:51:00.143000549 +0000 UTC m=+0.089103598 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.179 187247 DEBUG nova.network.neutron [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Successfully created port: b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:51:00 compute-0 sshd-session[209641]: Invalid user vncuser from 102.210.148.92 port 55646
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.600 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.602 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.602 187247 INFO nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Creating image(s)
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.603 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.603 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.604 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.605 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:00 compute-0 nova_compute[187243]: 2025-12-02 23:51:00.606 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:00 compute-0 sshd-session[209641]: Received disconnect from 102.210.148.92 port 55646:11: Bye Bye [preauth]
Dec 02 23:51:00 compute-0 sshd-session[209641]: Disconnected from invalid user vncuser 102.210.148.92 port 55646 [preauth]
Dec 02 23:51:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:00.670 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:00.671 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:00.671 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:01 compute-0 openstack_network_exporter[199746]: ERROR   23:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:51:01 compute-0 openstack_network_exporter[199746]: ERROR   23:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:01 compute-0 openstack_network_exporter[199746]: ERROR   23:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:01 compute-0 openstack_network_exporter[199746]: ERROR   23:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:51:01 compute-0 openstack_network_exporter[199746]: ERROR   23:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:51:01 compute-0 nova_compute[187243]: 2025-12-02 23:51:01.933 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:51:01 compute-0 nova_compute[187243]: 2025-12-02 23:51:01.936 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:51:01 compute-0 nova_compute[187243]: 2025-12-02 23:51:01.936 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.020 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
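The two processutils entries above show Nova wrapping `qemu-img info` in `oslo_concurrency.prlimit` so the probe cannot exceed 1 GiB of address space or 30 s of CPU time (a guard against malformed images). A minimal sketch of assembling that argv — `build_qemu_img_info_cmd` is a hypothetical helper for illustration, not Nova's actual code:

```python
# Sketch of the resource-limited qemu-img info call seen in the log above.
# build_qemu_img_info_cmd is a hypothetical helper; the real call is made by
# nova.virt.images via oslo_concurrency.processutils.

def build_qemu_img_info_cmd(path, mem_bytes=1073741824, cpu_secs=30):
    """Return the argv list for a prlimit-wrapped qemu-img info probe."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={mem_bytes}",   # cap the child's address space (bytes)
        f"--cpu={cpu_secs}",   # cap the child's CPU time (seconds)
        "--",
        "env", "LC_ALL=C", "LANG=C",   # force stable, parseable output
        "qemu-img", "info", path,
        "--force-share",               # probe even if another process holds the image
        "--output=json",
    ]

cmd = build_qemu_img_info_cmd(
    "/var/lib/nova/instances/_base/"
    "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part")
print(" ".join(cmd))
```

The JSON output is what lets the next log line conclude the image "was qcow2" before converting it to raw.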
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.021 187247 DEBUG nova.virt.images [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] 92e79321-71af-44a0-869c-1d5a9da5fefc was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.022 187247 DEBUG nova.privsep.utils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.022 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.030 187247 DEBUG nova.network.neutron [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Successfully updated port: b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.249 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.part /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted" returned: 0 in 0.226s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.253 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.338 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0.converted --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.339 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.734s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.340 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.343 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.344 187247 INFO oslo.privsep.daemon [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmptf2wiyux/privsep.sock']
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.445 187247 DEBUG nova.compute.manager [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-changed-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.445 187247 DEBUG nova.compute.manager [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Refreshing instance network info cache due to event network-changed-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.446 187247 DEBUG oslo_concurrency.lockutils [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c6ff891f-7953-444d-9ebf-df71ec387311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.446 187247 DEBUG oslo_concurrency.lockutils [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c6ff891f-7953-444d-9ebf-df71ec387311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.446 187247 DEBUG nova.network.neutron [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Refreshing network info cache for port b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.538 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "refresh_cache-c6ff891f-7953-444d-9ebf-df71ec387311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:51:02 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.953 187247 WARNING neutronclient.v2_0.client [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.105 187247 INFO oslo.privsep.daemon [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Spawned new privsep daemon via rootwrap
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.949 209684 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.953 209684 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.955 209684 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:02.956 209684 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209684
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.226 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.297 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.298 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.299 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.300 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.304 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.304 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.365 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.367 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.405 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
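The `create_qcow2_image` step above does not copy the base image: it creates a thin qcow2 copy-on-write overlay whose `backing_file` points at the shared raw base in `_base/`, so the instance disk starts near-empty and only diverging blocks are written. A sketch of the argv construction, mirroring the logged command (hypothetical helper, not Nova's code):

```python
# Sketch of the qemu-img create call from the log above: a qcow2 overlay
# backed by the shared raw base image. build_overlay_cmd is a hypothetical
# helper; Nova issues this via nova.virt.libvirt.imagebackend.Qcow2.

def build_overlay_cmd(base_path, overlay_path, size_bytes):
    """Argv for creating a copy-on-write qcow2 overlay over a raw base."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create",
        "-f", "qcow2",
        # backing_fmt=raw pins the base format so qemu never probes it,
        # which matters for both safety and startup speed.
        "-o", f"backing_file={base_path},backing_fmt=raw",
        overlay_path,
        str(size_bytes),   # virtual size; 1073741824 = 1 GiB in the log
    ]

cmd = build_overlay_cmd(
    "/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0",
    "/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk",
    1073741824)
```

This is why the subsequent resize check can refuse to shrink: the overlay's virtual size is fixed at creation, and qcow2 disks can only grow.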
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.407 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.407 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.505 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.506 187247 DEBUG nova.virt.disk.api [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Checking if we can resize image /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.506 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.576 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.577 187247 DEBUG nova.virt.disk.api [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Cannot resize image /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.578 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.578 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Ensure instance console log exists: /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.578 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.579 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:03 compute-0 nova_compute[187243]: 2025-12-02 23:51:03.579 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:04 compute-0 nova_compute[187243]: 2025-12-02 23:51:04.007 187247 DEBUG nova.network.neutron [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:51:04 compute-0 nova_compute[187243]: 2025-12-02 23:51:04.255 187247 DEBUG nova.network.neutron [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:51:04 compute-0 nova_compute[187243]: 2025-12-02 23:51:04.766 187247 DEBUG oslo_concurrency.lockutils [req-ce4b169a-20eb-43a2-8b32-3357a11a94d2 req-fe42f0a7-20bd-451c-a0f0-193903211a83 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c6ff891f-7953-444d-9ebf-df71ec387311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:51:04 compute-0 nova_compute[187243]: 2025-12-02 23:51:04.767 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquired lock "refresh_cache-c6ff891f-7953-444d-9ebf-df71ec387311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:51:04 compute-0 nova_compute[187243]: 2025-12-02 23:51:04.767 187247 DEBUG nova.network.neutron [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:51:05 compute-0 podman[209701]: 2025-12-02 23:51:05.164201326 +0000 UTC m=+0.100425263 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:51:05 compute-0 nova_compute[187243]: 2025-12-02 23:51:05.994 187247 DEBUG nova.network.neutron [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:51:06 compute-0 nova_compute[187243]: 2025-12-02 23:51:06.326 187247 WARNING neutronclient.v2_0.client [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.101 187247 DEBUG nova.network.neutron [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Updating instance_info_cache with network_info: [{"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.610 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Releasing lock "refresh_cache-c6ff891f-7953-444d-9ebf-df71ec387311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.610 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Instance network_info: |[{"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.613 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Start _get_guest_xml network_info=[{"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.617 187247 WARNING nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.619 187247 DEBUG nova.virt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestContinuousAudit-server-110327863', uuid='c6ff891f-7953-444d-9ebf-df71ec387311'), owner=OwnerMeta(userid='1987a2346a104d718c951016b26e9a93', username='tempest-TestContinuousAudit-982809030-project-admin', projectid='8aad1654ac0c43c38292ab72dec9fb3a', projectname='tempest-TestContinuousAudit-982809030'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719467.619366) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.623 187247 DEBUG nova.virt.libvirt.host [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.624 187247 DEBUG nova.virt.libvirt.host [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.628 187247 DEBUG nova.virt.libvirt.host [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.628 187247 DEBUG nova.virt.libvirt.host [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.630 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.630 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.630 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.631 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.631 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.631 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.631 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.631 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.632 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.632 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.632 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.632 187247 DEBUG nova.virt.hardware [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.635 187247 DEBUG nova.privsep.utils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.636 187247 DEBUG nova.virt.libvirt.vif [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-110327863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-110327863',id=1,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8aad1654ac0c43c38292ab72dec9fb3a',ramdisk_id='',reservation_id='r-g1daxpid',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestContinuousAudit-982809030',owner_user_name='tempest-TestContinuousAudit-982809030-project-admin'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:50:59Z,user_data=None,user_id='1987a2346a104d718c951016b26e9a93',uuid=c6ff891f-7953-444d-9ebf-df71ec387311,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.636 187247 DEBUG nova.network.os_vif_util [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Converting VIF {"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.637 187247 DEBUG nova.network.os_vif_util [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:51:07 compute-0 nova_compute[187243]: 2025-12-02 23:51:07.639 187247 DEBUG nova.objects.instance [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lazy-loading 'pci_devices' on Instance uuid c6ff891f-7953-444d-9ebf-df71ec387311 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.147 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <uuid>c6ff891f-7953-444d-9ebf-df71ec387311</uuid>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <name>instance-00000001</name>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:name>tempest-TestContinuousAudit-server-110327863</nova:name>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:51:07</nova:creationTime>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:51:08 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:51:08 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:user uuid="1987a2346a104d718c951016b26e9a93">tempest-TestContinuousAudit-982809030-project-admin</nova:user>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:project uuid="8aad1654ac0c43c38292ab72dec9fb3a">tempest-TestContinuousAudit-982809030</nova:project>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         <nova:port uuid="b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60">
Dec 02 23:51:08 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <system>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <entry name="serial">c6ff891f-7953-444d-9ebf-df71ec387311</entry>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <entry name="uuid">c6ff891f-7953-444d-9ebf-df71ec387311</entry>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </system>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <os>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </os>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <features>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </features>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.config"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:be:d2:e8"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <target dev="tapb3adaf8b-3f"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/console.log" append="off"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <video>
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </video>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:51:08 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:51:08 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:51:08 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:51:08 compute-0 nova_compute[187243]: </domain>
Dec 02 23:51:08 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.149 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Preparing to wait for external event network-vif-plugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.149 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.149 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.149 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.150 187247 DEBUG nova.virt.libvirt.vif [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-110327863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-110327863',id=1,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8aad1654ac0c43c38292ab72dec9fb3a',ramdisk_id='',reservation_id='r-g1daxpid',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestContinuousAudit-982809030',owner_user_name='tempest-TestContinuousAudit-982809030-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:50:59Z,user_data=None,user_id='1987a2346a104d718c951016b26e9a93',uuid=c6ff891f-7953-444d-9ebf-df71ec387311,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.150 187247 DEBUG nova.network.os_vif_util [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Converting VIF {"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.151 187247 DEBUG nova.network.os_vif_util [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.151 187247 DEBUG os_vif [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.181 187247 DEBUG ovsdbapp.backend.ovs_idl [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.182 187247 DEBUG ovsdbapp.backend.ovs_idl [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.182 187247 DEBUG ovsdbapp.backend.ovs_idl [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.182 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.183 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.183 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.183 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.184 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.187 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.192 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.193 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.193 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.194 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '709d7676-a77e-556d-a4a5-753be10719a1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.195 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.197 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:08 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.199 187247 INFO oslo.privsep.daemon [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmps96bbxl1/privsep.sock']
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.029 187247 INFO oslo.privsep.daemon [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Spawned new privsep daemon via rootwrap
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.868 209726 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.872 209726 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.874 209726 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:08.874 209726 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209726
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.291 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.292 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3adaf8b-3f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.293 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb3adaf8b-3f, col_values=(('qos', UUID('3ab3cecb-0760-4144-a8fa-3b795bbd0907')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.295 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb3adaf8b-3f, col_values=(('external_ids', {'iface-id': 'b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:d2:e8', 'vm-uuid': 'c6ff891f-7953-444d-9ebf-df71ec387311'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.327 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:09 compute-0 NetworkManager[55671]: <info>  [1764719469.3286] manager: (tapb3adaf8b-3f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.330 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.339 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:09 compute-0 nova_compute[187243]: 2025-12-02 23:51:09.340 187247 INFO os_vif [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f')
Dec 02 23:51:10 compute-0 nova_compute[187243]: 2025-12-02 23:51:10.396 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:10 compute-0 nova_compute[187243]: 2025-12-02 23:51:10.903 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:51:10 compute-0 nova_compute[187243]: 2025-12-02 23:51:10.903 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:51:10 compute-0 nova_compute[187243]: 2025-12-02 23:51:10.903 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] No VIF found with MAC fa:16:3e:be:d2:e8, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:51:10 compute-0 nova_compute[187243]: 2025-12-02 23:51:10.904 187247 INFO nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Using config drive
Dec 02 23:51:11 compute-0 nova_compute[187243]: 2025-12-02 23:51:11.417 187247 WARNING neutronclient.v2_0.client [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.113 187247 INFO nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Creating config drive at /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.config
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.119 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6p79n_rj execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.257 187247 DEBUG oslo_concurrency.processutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6p79n_rj" returned: 0 in 0.139s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:12 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 02 23:51:12 compute-0 NetworkManager[55671]: <info>  [1764719472.3605] manager: (tapb3adaf8b-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec 02 23:51:12 compute-0 kernel: tapb3adaf8b-3f: entered promiscuous mode
Dec 02 23:51:12 compute-0 ovn_controller[95488]: 2025-12-02T23:51:12Z|00040|binding|INFO|Claiming lport b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 for this chassis.
Dec 02 23:51:12 compute-0 ovn_controller[95488]: 2025-12-02T23:51:12Z|00041|binding|INFO|b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60: Claiming fa:16:3e:be:d2:e8 10.100.0.12
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.366 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.370 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.382 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:d2:e8 10.100.0.12'], port_security=['fa:16:3e:be:d2:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c6ff891f-7953-444d-9ebf-df71ec387311', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aad1654ac0c43c38292ab72dec9fb3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f54708b2-ddd7-45c4-9692-72d53e4b62eb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d26e027-073c-46dd-95e9-4c77fc749b25, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.383 104379 INFO neutron.agent.ovn.metadata.agent [-] Port b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 in datapath 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 bound to our chassis
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.385 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.421 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[02db6377-7dd7-46a7-80d7-2a3a5e1f499c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.422 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8af73ffa-61 in ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.426 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8af73ffa-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.426 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e4127e-6c05-4794-aff8-1487b1a12cee]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:12 compute-0 systemd-udevd[209753]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.428 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6bfd7d-64af-4d44-b6a0-e6e4f20cd46a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:12 compute-0 NetworkManager[55671]: <info>  [1764719472.4531] device (tapb3adaf8b-3f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.450 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[27891702-d70a-4990-a460-57abe3071a91]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:12 compute-0 NetworkManager[55671]: <info>  [1764719472.4573] device (tapb3adaf8b-3f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:51:12 compute-0 systemd-machined[153518]: New machine qemu-1-instance-00000001.
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.475 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.481 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[234bb78f-517b-41e0-9045-ad2afc4b3866]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:12 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:12.482 104379 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp4x_hxsx6/privsep.sock']
Dec 02 23:51:12 compute-0 ovn_controller[95488]: 2025-12-02T23:51:12Z|00042|binding|INFO|Setting lport b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 ovn-installed in OVS
Dec 02 23:51:12 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 02 23:51:12 compute-0 ovn_controller[95488]: 2025-12-02T23:51:12Z|00043|binding|INFO|Setting lport b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 up in Southbound
Dec 02 23:51:12 compute-0 nova_compute[187243]: 2025-12-02 23:51:12.485 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.380 104379 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.381 104379 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp4x_hxsx6/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.146 209783 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.149 209783 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.151 209783 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.151 209783 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209783
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.383 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[0c74f3a6-f994-49cc-b434-28b07877fc9f]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.535 187247 DEBUG nova.compute.manager [req-e2770bd5-3781-47b4-8e54-edf6eb9fc0bd req-f55681f3-442a-4e4e-b055-4c7d7a9666d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-plugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.536 187247 DEBUG oslo_concurrency.lockutils [req-e2770bd5-3781-47b4-8e54-edf6eb9fc0bd req-f55681f3-442a-4e4e-b055-4c7d7a9666d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.536 187247 DEBUG oslo_concurrency.lockutils [req-e2770bd5-3781-47b4-8e54-edf6eb9fc0bd req-f55681f3-442a-4e4e-b055-4c7d7a9666d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.536 187247 DEBUG oslo_concurrency.lockutils [req-e2770bd5-3781-47b4-8e54-edf6eb9fc0bd req-f55681f3-442a-4e4e-b055-4c7d7a9666d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.537 187247 DEBUG nova.compute.manager [req-e2770bd5-3781-47b4-8e54-edf6eb9fc0bd req-f55681f3-442a-4e4e-b055-4c7d7a9666d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Processing event network-vif-plugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.538 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.542 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.546 187247 INFO nova.virt.libvirt.driver [-] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Instance spawned successfully.
Dec 02 23:51:13 compute-0 nova_compute[187243]: 2025-12-02 23:51:13.546 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.875 209783 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.875 209783 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:13 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:13.875 209783 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.063 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.064 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.065 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.065 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.066 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.067 187247 DEBUG nova.virt.libvirt.driver [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.361 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.439 209783 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.444 209783 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.525 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[506eca01-1128-4388-aabd-2244b3ab2970]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.552 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf95935-0952-4c12-898a-12409af64262]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 NetworkManager[55671]: <info>  [1764719474.5546] manager: (tap8af73ffa-60): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Dec 02 23:51:14 compute-0 systemd-udevd[209754]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.580 187247 INFO nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Took 13.98 seconds to spawn the instance on the hypervisor.
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.582 187247 DEBUG nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.603 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[97f3390c-f1a2-43c7-b957-9bfeb99bcbc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.606 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[93debe37-de74-40f3-a01a-5103445e86ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 NetworkManager[55671]: <info>  [1764719474.6394] device (tap8af73ffa-60): carrier: link connected
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.652 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[cef467cc-433c-4aaf-be93-8bde76049644]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.686 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f8488e-b569-4e1b-b2ba-b683e8ca84b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8af73ffa-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:7d:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356528, 'reachable_time': 35790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209810, 'error': None, 'target': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.713 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9598f1d3-71eb-4f4e-a38b-b2be39769cf4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9a:7d48'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 356528, 'tstamp': 356528}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209811, 'error': None, 'target': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.741 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[525e58fe-9ee6-4f08-b9be-2e9832f3148e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8af73ffa-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:7d:48'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356528, 'reachable_time': 35790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209812, 'error': None, 'target': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.800 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ed39be-c8a6-4222-a856-0d9b0a3118b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.880 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cb32ca71-e97e-4639-86a4-c5ed03ef0cf1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.882 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8af73ffa-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.882 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.883 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8af73ffa-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.885 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:14 compute-0 kernel: tap8af73ffa-60: entered promiscuous mode
Dec 02 23:51:14 compute-0 NetworkManager[55671]: <info>  [1764719474.8897] manager: (tap8af73ffa-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.889 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.900 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8af73ffa-60, col_values=(('external_ids', {'iface-id': '7b376433-fabc-48f0-aa10-0098b8d1cf58'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:14 compute-0 ovn_controller[95488]: 2025-12-02T23:51:14Z|00044|binding|INFO|Releasing lport 7b376433-fabc-48f0-aa10-0098b8d1cf58 from this chassis (sb_readonly=0)
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.902 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:14 compute-0 nova_compute[187243]: 2025-12-02 23:51:14.926 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.928 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc231b6-c2c4-4288-90aa-63046adcc6d0]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.929 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.929 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.929 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.929 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.930 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1df7cb-2a3f-43d0-94a7-32799f5ef32d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.930 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.931 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6a81d48b-a6ab-4b5d-824e-ff455bde7287]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.932 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: global
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: defaults
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     log global
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:51:14 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:14.933 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'env', 'PROCESS_TAG=haproxy-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.167 187247 INFO nova.compute.manager [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Took 19.39 seconds to build instance.
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.429 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:15 compute-0 podman[209845]: 2025-12-02 23:51:15.364460208 +0000 UTC m=+0.026483882 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:51:15 compute-0 podman[209845]: 2025-12-02 23:51:15.475308981 +0000 UTC m=+0.137332645 container create 972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Dec 02 23:51:15 compute-0 systemd[1]: Started libpod-conmon-972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b.scope.
Dec 02 23:51:15 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:51:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2db713772e22ce1cb73c37f099a773a29602015d65c8ecb7abf9066b6c408ffe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:51:15 compute-0 podman[209845]: 2025-12-02 23:51:15.596192238 +0000 UTC m=+0.258215862 container init 972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:51:15 compute-0 podman[209858]: 2025-12-02 23:51:15.597286195 +0000 UTC m=+0.076049433 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:51:15 compute-0 podman[209845]: 2025-12-02 23:51:15.601359713 +0000 UTC m=+0.263383337 container start 972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:51:15 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [NOTICE]   (209886) : New worker (209889) forked
Dec 02 23:51:15 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [NOTICE]   (209886) : Loading success.
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.679 187247 DEBUG oslo_concurrency.lockutils [None req-a6dcd606-8d43-45e0-ac65-d5e06a0d88fd 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.930s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.778 187247 DEBUG nova.compute.manager [req-789731d7-23bd-419d-ab51-d250bd6d20b3 req-d60aa6d8-8cd7-4f91-a4f4-d9b4a3d71f9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-plugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.778 187247 DEBUG oslo_concurrency.lockutils [req-789731d7-23bd-419d-ab51-d250bd6d20b3 req-d60aa6d8-8cd7-4f91-a4f4-d9b4a3d71f9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.778 187247 DEBUG oslo_concurrency.lockutils [req-789731d7-23bd-419d-ab51-d250bd6d20b3 req-d60aa6d8-8cd7-4f91-a4f4-d9b4a3d71f9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.779 187247 DEBUG oslo_concurrency.lockutils [req-789731d7-23bd-419d-ab51-d250bd6d20b3 req-d60aa6d8-8cd7-4f91-a4f4-d9b4a3d71f9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.779 187247 DEBUG nova.compute.manager [req-789731d7-23bd-419d-ab51-d250bd6d20b3 req-d60aa6d8-8cd7-4f91-a4f4-d9b4a3d71f9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] No waiting events found dispatching network-vif-plugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:51:15 compute-0 nova_compute[187243]: 2025-12-02 23:51:15.779 187247 WARNING nova.compute.manager [req-789731d7-23bd-419d-ab51-d250bd6d20b3 req-d60aa6d8-8cd7-4f91-a4f4-d9b4a3d71f9b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received unexpected event network-vif-plugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 for instance with vm_state active and task_state None.
Dec 02 23:51:16 compute-0 sshd-session[209898]: Invalid user david from 23.95.37.90 port 34908
Dec 02 23:51:16 compute-0 sshd-session[209898]: Received disconnect from 23.95.37.90 port 34908:11: Bye Bye [preauth]
Dec 02 23:51:16 compute-0 sshd-session[209898]: Disconnected from invalid user david 23.95.37.90 port 34908 [preauth]
Dec 02 23:51:19 compute-0 podman[209900]: 2025-12-02 23:51:19.113626997 +0000 UTC m=+0.057393531 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 02 23:51:19 compute-0 podman[209901]: 2025-12-02 23:51:19.176494129 +0000 UTC m=+0.126501994 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:51:19 compute-0 nova_compute[187243]: 2025-12-02 23:51:19.363 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:20 compute-0 nova_compute[187243]: 2025-12-02 23:51:20.478 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:23 compute-0 nova_compute[187243]: 2025-12-02 23:51:23.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:23 compute-0 nova_compute[187243]: 2025-12-02 23:51:23.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:51:24 compute-0 nova_compute[187243]: 2025-12-02 23:51:24.365 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:24 compute-0 nova_compute[187243]: 2025-12-02 23:51:24.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:24 compute-0 nova_compute[187243]: 2025-12-02 23:51:24.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:25 compute-0 nova_compute[187243]: 2025-12-02 23:51:25.480 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:25 compute-0 nova_compute[187243]: 2025-12-02 23:51:25.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:26 compute-0 ovn_controller[95488]: 2025-12-02T23:51:26Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:d2:e8 10.100.0.12
Dec 02 23:51:26 compute-0 ovn_controller[95488]: 2025-12-02T23:51:26Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:d2:e8 10.100.0.12
Dec 02 23:51:26 compute-0 nova_compute[187243]: 2025-12-02 23:51:26.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:27 compute-0 nova_compute[187243]: 2025-12-02 23:51:27.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:28 compute-0 nova_compute[187243]: 2025-12-02 23:51:28.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:28 compute-0 nova_compute[187243]: 2025-12-02 23:51:28.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:28 compute-0 nova_compute[187243]: 2025-12-02 23:51:28.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:28 compute-0 nova_compute[187243]: 2025-12-02 23:51:28.105 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.148 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.229 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.230 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.303 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.367 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.470 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.473 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.505 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.506 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5677MB free_disk=73.14057159423828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.507 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:29 compute-0 nova_compute[187243]: 2025-12-02 23:51:29.507 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:29 compute-0 podman[197600]: time="2025-12-02T23:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:51:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:51:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3045 "" "Go-http-client/1.1"
Dec 02 23:51:30 compute-0 nova_compute[187243]: 2025-12-02 23:51:30.482 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:30 compute-0 nova_compute[187243]: 2025-12-02 23:51:30.559 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance c6ff891f-7953-444d-9ebf-df71ec387311 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:51:30 compute-0 nova_compute[187243]: 2025-12-02 23:51:30.560 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:51:30 compute-0 nova_compute[187243]: 2025-12-02 23:51:30.560 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:51:29 up 59 min,  0 user,  load average: 0.47, 0.40, 0.50\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8aad1654ac0c43c38292ab72dec9fb3a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:51:30 compute-0 nova_compute[187243]: 2025-12-02 23:51:30.643 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:51:31 compute-0 podman[209980]: 2025-12-02 23:51:31.116045281 +0000 UTC m=+0.065059116 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, architecture=x86_64, 
url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.181 187247 ERROR nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [req-d5d696bd-2524-439f-838c-63f7b8a5a7fa] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 0d6e1fe8-f800-4b94-a0c0-ea75083d5248.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-d5d696bd-2524-439f-838c-63f7b8a5a7fa"}]}
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.213 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.230 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.231 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.253 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.287 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.327 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: ERROR   23:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: ERROR   23:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: ERROR   23:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: ERROR   23:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: ERROR   23:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:51:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.749 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.750 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.750 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.750 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.750 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.768 187247 INFO nova.compute.manager [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Terminating instance
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.888 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updated inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.888 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 02 23:51:31 compute-0 nova_compute[187243]: 2025-12-02 23:51:31.889 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.291 187247 DEBUG nova.compute.manager [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:51:32 compute-0 kernel: tapb3adaf8b-3f (unregistering): left promiscuous mode
Dec 02 23:51:32 compute-0 NetworkManager[55671]: <info>  [1764719492.3158] device (tapb3adaf8b-3f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:51:32 compute-0 ovn_controller[95488]: 2025-12-02T23:51:32Z|00045|binding|INFO|Releasing lport b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 from this chassis (sb_readonly=0)
Dec 02 23:51:32 compute-0 ovn_controller[95488]: 2025-12-02T23:51:32Z|00046|binding|INFO|Setting lport b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 down in Southbound
Dec 02 23:51:32 compute-0 ovn_controller[95488]: 2025-12-02T23:51:32Z|00047|binding|INFO|Removing iface tapb3adaf8b-3f ovn-installed in OVS
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.327 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.335 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:d2:e8 10.100.0.12'], port_security=['fa:16:3e:be:d2:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c6ff891f-7953-444d-9ebf-df71ec387311', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aad1654ac0c43c38292ab72dec9fb3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f54708b2-ddd7-45c4-9692-72d53e4b62eb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d26e027-073c-46dd-95e9-4c77fc749b25, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.337 104379 INFO neutron.agent.ovn.metadata.agent [-] Port b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 in datapath 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 unbound from our chassis
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.338 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.339 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc11668-c81c-43b9-87e7-a33157ef9335]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.340 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 namespace which is not needed anymore
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.345 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:32 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 02 23:51:32 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.218s CPU time.
Dec 02 23:51:32 compute-0 systemd-machined[153518]: Machine qemu-1-instance-00000001 terminated.
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.401 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.402 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.894s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:32 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [NOTICE]   (209886) : haproxy version is 3.0.5-8e879a5
Dec 02 23:51:32 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [NOTICE]   (209886) : path to executable is /usr/sbin/haproxy
Dec 02 23:51:32 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [WARNING]  (209886) : Exiting Master process...
Dec 02 23:51:32 compute-0 podman[210027]: 2025-12-02 23:51:32.476740824 +0000 UTC m=+0.044722354 container kill 972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 23:51:32 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [ALERT]    (209886) : Current worker (209889) exited with code 143 (Terminated)
Dec 02 23:51:32 compute-0 neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522[209866]: [WARNING]  (209886) : All workers exited. Exiting... (0)
Dec 02 23:51:32 compute-0 systemd[1]: libpod-972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b.scope: Deactivated successfully.
Dec 02 23:51:32 compute-0 podman[210045]: 2025-12-02 23:51:32.534140973 +0000 UTC m=+0.027772503 container died 972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.542 187247 DEBUG nova.compute.manager [req-a31aa37a-c120-4965-bf67-fb999308220c req-2e52f805-9e46-4103-b8f1-f65d8cc80058 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-unplugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.542 187247 DEBUG oslo_concurrency.lockutils [req-a31aa37a-c120-4965-bf67-fb999308220c req-2e52f805-9e46-4103-b8f1-f65d8cc80058 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.543 187247 DEBUG oslo_concurrency.lockutils [req-a31aa37a-c120-4965-bf67-fb999308220c req-2e52f805-9e46-4103-b8f1-f65d8cc80058 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.543 187247 DEBUG oslo_concurrency.lockutils [req-a31aa37a-c120-4965-bf67-fb999308220c req-2e52f805-9e46-4103-b8f1-f65d8cc80058 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.543 187247 DEBUG nova.compute.manager [req-a31aa37a-c120-4965-bf67-fb999308220c req-2e52f805-9e46-4103-b8f1-f65d8cc80058 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] No waiting events found dispatching network-vif-unplugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.543 187247 DEBUG nova.compute.manager [req-a31aa37a-c120-4965-bf67-fb999308220c req-2e52f805-9e46-4103-b8f1-f65d8cc80058 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-unplugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.555 187247 INFO nova.virt.libvirt.driver [-] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Instance destroyed successfully.
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.555 187247 DEBUG nova.objects.instance [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lazy-loading 'resources' on Instance uuid c6ff891f-7953-444d-9ebf-df71ec387311 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:51:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b-userdata-shm.mount: Deactivated successfully.
Dec 02 23:51:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-2db713772e22ce1cb73c37f099a773a29602015d65c8ecb7abf9066b6c408ffe-merged.mount: Deactivated successfully.
Dec 02 23:51:32 compute-0 podman[210045]: 2025-12-02 23:51:32.586826429 +0000 UTC m=+0.080457949 container remove 972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.591 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d2403efb-7ed2-4819-b4c0-1af817a1d5f2]: (4, ("Tue Dec  2 11:51:32 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 (972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b)\n972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b\nTue Dec  2 11:51:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 (972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b)\n972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 systemd[1]: libpod-conmon-972d5d8eb5248300830e5e4932db535facb6543715843a16aa06085b4793067b.scope: Deactivated successfully.
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.593 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c9120e84-236d-422b-a8dd-c8cf99fd6803]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.593 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.593 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e10216d3-8aaf-42a1-adcd-cb2b12a2a686]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.594 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8af73ffa-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.596 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:32 compute-0 kernel: tap8af73ffa-60: left promiscuous mode
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.607 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:32 compute-0 nova_compute[187243]: 2025-12-02 23:51:32.611 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.613 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f3731777-2547-4183-b667-bb72a6fd410c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.629 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4287175b-0ae0-4cbf-9a9b-e69008dcc2f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.631 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb78742-cb2f-4ebd-9200-38697d58797d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.646 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2f51ad3f-17b2-4c29-a04f-d7e3040da148]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356516, 'reachable_time': 41924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210091, 'error': None, 'target': 'ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d8af73ffa\x2d6c2c\x2d49c7\x2d87e5\x2de2d4e6d2e522.mount: Deactivated successfully.
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.650 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 02 23:51:32 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:32.651 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d6a149-44ec-4a29-8219-ca227fcdd684]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.061 187247 DEBUG nova.virt.libvirt.vif [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-110327863',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-110327863',id=1,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:51:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8aad1654ac0c43c38292ab72dec9fb3a',ramdisk_id='',reservation_id='r-g1daxpid',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestContinuousAudit-982809030',owner_user_name='tempest-TestContinuousAudit-982809030-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:51:14Z,user_data=None,user_id='1987a2346a104d718c951016b26e9a93',uuid=c6ff891f-7953-444d-9ebf-df71ec387311,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.062 187247 DEBUG nova.network.os_vif_util [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Converting VIF {"id": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "address": "fa:16:3e:be:d2:e8", "network": {"id": "8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1757933674-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b76ae8150c43ac98862da676697b95", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3adaf8b-3f", "ovs_interfaceid": "b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.062 187247 DEBUG nova.network.os_vif_util [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.063 187247 DEBUG os_vif [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.065 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.065 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3adaf8b-3f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.066 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.068 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.069 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.069 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3ab3cecb-0760-4144-a8fa-3b795bbd0907) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.070 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.071 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.073 187247 INFO os_vif [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:d2:e8,bridge_name='br-int',has_traffic_filtering=True,id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60,network=Network(8af73ffa-6c2c-49c7-87e5-e2d4e6d2e522),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3adaf8b-3f')
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.074 187247 INFO nova.virt.libvirt.driver [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Deleting instance files /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311_del
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.075 187247 INFO nova.virt.libvirt.driver [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Deletion of /var/lib/nova/instances/c6ff891f-7953-444d-9ebf-df71ec387311_del complete
Dec 02 23:51:33 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:33.075 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:51:33 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:33.076 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.078 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.402 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.403 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.588 187247 INFO nova.compute.manager [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Took 1.30 seconds to destroy the instance on the hypervisor.
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.589 187247 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.589 187247 DEBUG nova.compute.manager [-] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.590 187247 DEBUG nova.network.neutron [-] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:51:33 compute-0 nova_compute[187243]: 2025-12-02 23:51:33.590 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.010 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.440 187247 DEBUG nova.compute.manager [req-98aa9614-95cd-4c22-9ead-05edce769e2a req-8bc38045-c3e0-4a4e-b334-6d5fecbdfeab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-deleted-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.441 187247 INFO nova.compute.manager [req-98aa9614-95cd-4c22-9ead-05edce769e2a req-8bc38045-c3e0-4a4e-b334-6d5fecbdfeab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Neutron deleted interface b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60; detaching it from the instance and deleting it from the info cache
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.441 187247 DEBUG nova.network.neutron [req-98aa9614-95cd-4c22-9ead-05edce769e2a req-8bc38045-c3e0-4a4e-b334-6d5fecbdfeab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.618 187247 DEBUG nova.compute.manager [req-4c89e557-2829-4b44-8833-7fa80b9ff107 req-5d223b65-3f83-4b84-8af2-6f4905b15615 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-unplugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.619 187247 DEBUG oslo_concurrency.lockutils [req-4c89e557-2829-4b44-8833-7fa80b9ff107 req-5d223b65-3f83-4b84-8af2-6f4905b15615 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.619 187247 DEBUG oslo_concurrency.lockutils [req-4c89e557-2829-4b44-8833-7fa80b9ff107 req-5d223b65-3f83-4b84-8af2-6f4905b15615 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.620 187247 DEBUG oslo_concurrency.lockutils [req-4c89e557-2829-4b44-8833-7fa80b9ff107 req-5d223b65-3f83-4b84-8af2-6f4905b15615 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.620 187247 DEBUG nova.compute.manager [req-4c89e557-2829-4b44-8833-7fa80b9ff107 req-5d223b65-3f83-4b84-8af2-6f4905b15615 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] No waiting events found dispatching network-vif-unplugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.620 187247 DEBUG nova.compute.manager [req-4c89e557-2829-4b44-8833-7fa80b9ff107 req-5d223b65-3f83-4b84-8af2-6f4905b15615 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Received event network-vif-unplugged-b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.882 187247 DEBUG nova.network.neutron [-] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:51:34 compute-0 nova_compute[187243]: 2025-12-02 23:51:34.951 187247 DEBUG nova.compute.manager [req-98aa9614-95cd-4c22-9ead-05edce769e2a req-8bc38045-c3e0-4a4e-b334-6d5fecbdfeab 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Detach interface failed, port_id=b3adaf8b-3f3d-455b-8066-e3eeb7d7bd60, reason: Instance c6ff891f-7953-444d-9ebf-df71ec387311 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 02 23:51:35 compute-0 nova_compute[187243]: 2025-12-02 23:51:35.389 187247 INFO nova.compute.manager [-] [instance: c6ff891f-7953-444d-9ebf-df71ec387311] Took 1.80 seconds to deallocate network for instance.
Dec 02 23:51:35 compute-0 nova_compute[187243]: 2025-12-02 23:51:35.484 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:35 compute-0 nova_compute[187243]: 2025-12-02 23:51:35.911 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:51:35 compute-0 nova_compute[187243]: 2025-12-02 23:51:35.912 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:51:35 compute-0 nova_compute[187243]: 2025-12-02 23:51:35.957 187247 DEBUG nova.compute.provider_tree [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:51:36 compute-0 podman[210094]: 2025-12-02 23:51:36.148640682 +0000 UTC m=+0.082691933 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:51:36 compute-0 nova_compute[187243]: 2025-12-02 23:51:36.466 187247 DEBUG nova.scheduler.client.report [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:51:36 compute-0 nova_compute[187243]: 2025-12-02 23:51:36.978 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:37 compute-0 nova_compute[187243]: 2025-12-02 23:51:37.000 187247 INFO nova.scheduler.client.report [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Deleted allocations for instance c6ff891f-7953-444d-9ebf-df71ec387311
Dec 02 23:51:38 compute-0 nova_compute[187243]: 2025-12-02 23:51:38.031 187247 DEBUG oslo_concurrency.lockutils [None req-7b873577-1fbb-45ea-b1f9-c82274f187ce 1987a2346a104d718c951016b26e9a93 8aad1654ac0c43c38292ab72dec9fb3a - - default default] Lock "c6ff891f-7953-444d-9ebf-df71ec387311" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.282s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:51:38 compute-0 nova_compute[187243]: 2025-12-02 23:51:38.070 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:38 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:38.077 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:51:38 compute-0 sshd-session[210114]: Received disconnect from 61.220.235.10 port 42464:11: Bye Bye [preauth]
Dec 02 23:51:38 compute-0 sshd-session[210114]: Disconnected from authenticating user root 61.220.235.10 port 42464 [preauth]
Dec 02 23:51:40 compute-0 nova_compute[187243]: 2025-12-02 23:51:40.486 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:43 compute-0 sshd-session[210116]: Received disconnect from 45.78.219.95 port 46032:11: Bye Bye [preauth]
Dec 02 23:51:43 compute-0 sshd-session[210116]: Disconnected from authenticating user root 45.78.219.95 port 46032 [preauth]
Dec 02 23:51:43 compute-0 nova_compute[187243]: 2025-12-02 23:51:43.118 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:43 compute-0 nova_compute[187243]: 2025-12-02 23:51:43.863 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:45 compute-0 nova_compute[187243]: 2025-12-02 23:51:45.488 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:46 compute-0 podman[210120]: 2025-12-02 23:51:46.138726636 +0000 UTC m=+0.074957236 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:51:47 compute-0 sshd-session[210144]: Invalid user odin from 20.123.120.169 port 60070
Dec 02 23:51:47 compute-0 sshd-session[210144]: Received disconnect from 20.123.120.169 port 60070:11: Bye Bye [preauth]
Dec 02 23:51:47 compute-0 sshd-session[210144]: Disconnected from invalid user odin 20.123.120.169 port 60070 [preauth]
Dec 02 23:51:48 compute-0 nova_compute[187243]: 2025-12-02 23:51:48.121 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:50 compute-0 podman[210146]: 2025-12-02 23:51:50.088923081 +0000 UTC m=+0.045497862 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:51:50 compute-0 podman[210147]: 2025-12-02 23:51:50.134689889 +0000 UTC m=+0.088905113 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:51:50 compute-0 nova_compute[187243]: 2025-12-02 23:51:50.490 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:53 compute-0 nova_compute[187243]: 2025-12-02 23:51:53.123 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:55 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:55.420 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:40:c1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3957003f9ee8492688556ccb0cd5fdd5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f7b0db-e402-4898-a011-4ec4b4fff19a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=64808fc3-a961-45a4-b901-b4f1049f2d12) old=Port_Binding(mac=['fa:16:3e:56:40:c1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3957003f9ee8492688556ccb0cd5fdd5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:51:55 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:55.421 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 64808fc3-a961-45a4-b901-b4f1049f2d12 in datapath 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 updated
Dec 02 23:51:55 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:55.422 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:51:55 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:51:55.423 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[492f439a-5724-42ac-954e-ad9fddcd9236]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:51:55 compute-0 nova_compute[187243]: 2025-12-02 23:51:55.492 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:58 compute-0 nova_compute[187243]: 2025-12-02 23:51:58.125 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:51:59 compute-0 podman[197600]: time="2025-12-02T23:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:51:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:51:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Dec 02 23:52:00 compute-0 nova_compute[187243]: 2025-12-02 23:52:00.495 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:00.672 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:00.672 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:00.672 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: ERROR   23:52:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: ERROR   23:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: ERROR   23:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: ERROR   23:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: ERROR   23:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:52:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:52:02 compute-0 podman[210189]: 2025-12-02 23:52:02.1654735 +0000 UTC m=+0.112395542 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:52:03 compute-0 nova_compute[187243]: 2025-12-02 23:52:03.128 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:03 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:03.923 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:5a:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f5fe24-1162-482d-92ac-517788927a8f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9c0c9072-9900-46a2-b66d-c5e8a478205d) old=Port_Binding(mac=['fa:16:3e:5c:5a:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ed5d5a2-c53b-47f2-94f0-955fc91515bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:52:03 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:03.924 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9c0c9072-9900-46a2-b66d-c5e8a478205d in datapath 2ed5d5a2-c53b-47f2-94f0-955fc91515bf updated
Dec 02 23:52:03 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:03.925 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ed5d5a2-c53b-47f2-94f0-955fc91515bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:52:03 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:03.926 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1121925d-5fda-444f-82df-e732e8a196a2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:05 compute-0 nova_compute[187243]: 2025-12-02 23:52:05.508 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:07 compute-0 podman[210210]: 2025-12-02 23:52:07.143996473 +0000 UTC m=+0.092883590 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:52:08 compute-0 nova_compute[187243]: 2025-12-02 23:52:08.130 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:10 compute-0 nova_compute[187243]: 2025-12-02 23:52:10.510 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:13 compute-0 nova_compute[187243]: 2025-12-02 23:52:13.132 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:13 compute-0 nova_compute[187243]: 2025-12-02 23:52:13.946 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:13 compute-0 nova_compute[187243]: 2025-12-02 23:52:13.946 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:14 compute-0 nova_compute[187243]: 2025-12-02 23:52:14.453 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:52:15 compute-0 sshd-session[210230]: Invalid user adam from 102.210.148.92 port 50824
Dec 02 23:52:15 compute-0 nova_compute[187243]: 2025-12-02 23:52:15.512 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:15 compute-0 sshd-session[210230]: Received disconnect from 102.210.148.92 port 50824:11: Bye Bye [preauth]
Dec 02 23:52:15 compute-0 sshd-session[210230]: Disconnected from invalid user adam 102.210.148.92 port 50824 [preauth]
Dec 02 23:52:16 compute-0 nova_compute[187243]: 2025-12-02 23:52:16.075 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:16 compute-0 nova_compute[187243]: 2025-12-02 23:52:16.076 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:16 compute-0 nova_compute[187243]: 2025-12-02 23:52:16.087 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:52:16 compute-0 nova_compute[187243]: 2025-12-02 23:52:16.087 187247 INFO nova.compute.claims [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Claim successful on node compute-0.ctlplane.example.com
Dec 02 23:52:16 compute-0 ovn_controller[95488]: 2025-12-02T23:52:16Z|00048|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 02 23:52:17 compute-0 podman[210232]: 2025-12-02 23:52:17.131619577 +0000 UTC m=+0.089201280 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:52:17 compute-0 nova_compute[187243]: 2025-12-02 23:52:17.160 187247 DEBUG nova.compute.provider_tree [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:52:17 compute-0 nova_compute[187243]: 2025-12-02 23:52:17.672 187247 DEBUG nova.scheduler.client.report [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.134 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.186 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.187 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.700 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.701 187247 DEBUG nova.network.neutron [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.701 187247 WARNING neutronclient.v2_0.client [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:18 compute-0 nova_compute[187243]: 2025-12-02 23:52:18.702 187247 WARNING neutronclient.v2_0.client [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:19 compute-0 nova_compute[187243]: 2025-12-02 23:52:19.210 187247 INFO nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:52:19 compute-0 nova_compute[187243]: 2025-12-02 23:52:19.237 187247 DEBUG nova.network.neutron [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Successfully created port: 13cba066-1d7a-4449-8510-04cbf6aeed5f _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:52:19 compute-0 nova_compute[187243]: 2025-12-02 23:52:19.718 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.173 187247 DEBUG nova.network.neutron [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Successfully updated port: 13cba066-1d7a-4449-8510-04cbf6aeed5f _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.244 187247 DEBUG nova.compute.manager [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-changed-13cba066-1d7a-4449-8510-04cbf6aeed5f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.244 187247 DEBUG nova.compute.manager [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Refreshing instance network info cache due to event network-changed-13cba066-1d7a-4449-8510-04cbf6aeed5f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.245 187247 DEBUG oslo_concurrency.lockutils [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-11673f94-0590-4a0b-a344-0dfac27faf87" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.245 187247 DEBUG oslo_concurrency.lockutils [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-11673f94-0590-4a0b-a344-0dfac27faf87" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.245 187247 DEBUG nova.network.neutron [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Refreshing network info cache for port 13cba066-1d7a-4449-8510-04cbf6aeed5f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.513 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.679 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "refresh_cache-11673f94-0590-4a0b-a344-0dfac27faf87" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.736 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.737 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.737 187247 INFO nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Creating image(s)
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.738 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "/var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.738 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "/var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.739 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "/var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.739 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.743 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.744 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.752 187247 WARNING neutronclient.v2_0.client [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.797 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.798 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.798 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.799 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.802 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.803 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.853 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.854 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.887 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.888 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.889 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.939 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.940 187247 DEBUG nova.virt.disk.api [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Checking if we can resize image /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.941 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.989 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.990 187247 DEBUG nova.virt.disk.api [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Cannot resize image /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.990 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.990 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Ensure instance console log exists: /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.991 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.991 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:20 compute-0 nova_compute[187243]: 2025-12-02 23:52:20.991 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:21 compute-0 nova_compute[187243]: 2025-12-02 23:52:21.072 187247 DEBUG nova.network.neutron [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:52:21 compute-0 podman[210274]: 2025-12-02 23:52:21.089351365 +0000 UTC m=+0.046537337 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:52:21 compute-0 podman[210275]: 2025-12-02 23:52:21.15108478 +0000 UTC m=+0.102331569 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 02 23:52:21 compute-0 nova_compute[187243]: 2025-12-02 23:52:21.214 187247 DEBUG nova.network.neutron [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:52:21 compute-0 nova_compute[187243]: 2025-12-02 23:52:21.721 187247 DEBUG oslo_concurrency.lockutils [req-3aabddcb-1aa6-4ae5-afcf-3ec9796ee801 req-2459db1d-492e-4f5f-8fff-553a0b24a8db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-11673f94-0590-4a0b-a344-0dfac27faf87" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:52:21 compute-0 nova_compute[187243]: 2025-12-02 23:52:21.722 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquired lock "refresh_cache-11673f94-0590-4a0b-a344-0dfac27faf87" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:52:21 compute-0 nova_compute[187243]: 2025-12-02 23:52:21.722 187247 DEBUG nova.network.neutron [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:52:21 compute-0 sshd-session[210257]: Received disconnect from 49.247.36.49 port 18978:11: Bye Bye [preauth]
Dec 02 23:52:21 compute-0 sshd-session[210257]: Disconnected from authenticating user root 49.247.36.49 port 18978 [preauth]
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.039 187247 DEBUG nova.network.neutron [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.102 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.103 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.136 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.739 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.740 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:23 compute-0 nova_compute[187243]: 2025-12-02 23:52:23.740 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.023 187247 WARNING neutronclient.v2_0.client [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.220 187247 DEBUG nova.network.neutron [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Updating instance_info_cache with network_info: [{"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.726 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Releasing lock "refresh_cache-11673f94-0590-4a0b-a344-0dfac27faf87" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.727 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Instance network_info: |[{"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.729 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Start _get_guest_xml network_info=[{"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.733 187247 WARNING nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.734 187247 DEBUG nova.virt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-1017793270', uuid='11673f94-0590-4a0b-a344-0dfac27faf87'), owner=OwnerMeta(userid='d032790eea2c4094b69ea4a2576bff68', username='tempest-TestDataModel-1253061916-project-admin', projectid='059ee4b8b9ab47ffbc539c03339a4112', projectname='tempest-TestDataModel-1253061916'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": 
false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719544.7342439) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.738 187247 DEBUG nova.virt.libvirt.host [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.739 187247 DEBUG nova.virt.libvirt.host [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.741 187247 DEBUG nova.virt.libvirt.host [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.742 187247 DEBUG nova.virt.libvirt.host [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.743 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.743 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.743 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.744 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.744 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.744 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.744 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.744 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.745 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.745 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.745 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.745 187247 DEBUG nova.virt.hardware [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.749 187247 DEBUG nova.virt.libvirt.vif [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:52:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1017793270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1017793270',id=2,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059ee4b8b9ab47ffbc539c03339a4112',ramdisk_id='',reservation_id='r-6r0rl3jv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1253061916',owner_user_name='tempest-TestDataModel-1253061916-project-admin'},tags=TagList,task_state=
'spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:52:19Z,user_data=None,user_id='d032790eea2c4094b69ea4a2576bff68',uuid=11673f94-0590-4a0b-a344-0dfac27faf87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.749 187247 DEBUG nova.network.os_vif_util [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converting VIF {"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.750 187247 DEBUG nova.network.os_vif_util [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:52:24 compute-0 nova_compute[187243]: 2025-12-02 23:52:24.750 187247 DEBUG nova.objects.instance [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11673f94-0590-4a0b-a344-0dfac27faf87 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.260 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <uuid>11673f94-0590-4a0b-a344-0dfac27faf87</uuid>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <name>instance-00000002</name>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:name>tempest-TestDataModel-server-1017793270</nova:name>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:52:24</nova:creationTime>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:52:25 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:52:25 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:user uuid="d032790eea2c4094b69ea4a2576bff68">tempest-TestDataModel-1253061916-project-admin</nova:user>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:project uuid="059ee4b8b9ab47ffbc539c03339a4112">tempest-TestDataModel-1253061916</nova:project>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         <nova:port uuid="13cba066-1d7a-4449-8510-04cbf6aeed5f">
Dec 02 23:52:25 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <system>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <entry name="serial">11673f94-0590-4a0b-a344-0dfac27faf87</entry>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <entry name="uuid">11673f94-0590-4a0b-a344-0dfac27faf87</entry>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </system>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <os>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </os>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <features>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </features>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.config"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:b7:db:88"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <target dev="tap13cba066-1d"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/console.log" append="off"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <video>
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </video>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:52:25 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:52:25 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:52:25 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:52:25 compute-0 nova_compute[187243]: </domain>
Dec 02 23:52:25 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.262 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Preparing to wait for external event network-vif-plugged-13cba066-1d7a-4449-8510-04cbf6aeed5f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.263 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.263 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.263 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.264 187247 DEBUG nova.virt.libvirt.vif [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:52:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1017793270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1017793270',id=2,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='059ee4b8b9ab47ffbc539c03339a4112',ramdisk_id='',reservation_id='r-6r0rl3jv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1253061916',owner_user_name='tempest-TestDataModel-1253061916-project-admin'},tags=TagList,t
ask_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:52:19Z,user_data=None,user_id='d032790eea2c4094b69ea4a2576bff68',uuid=11673f94-0590-4a0b-a344-0dfac27faf87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.265 187247 DEBUG nova.network.os_vif_util [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converting VIF {"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.266 187247 DEBUG nova.network.os_vif_util [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.266 187247 DEBUG os_vif [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.267 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.268 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.268 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.269 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.269 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'be878b50-9b36-5f7e-8bc4-402f3659f566', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.271 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.274 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.277 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.278 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13cba066-1d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.278 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap13cba066-1d, col_values=(('qos', UUID('599dfd74-b74a-4905-b552-f21706daa464')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.279 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap13cba066-1d, col_values=(('external_ids', {'iface-id': '13cba066-1d7a-4449-8510-04cbf6aeed5f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:db:88', 'vm-uuid': '11673f94-0590-4a0b-a344-0dfac27faf87'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.280 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 NetworkManager[55671]: <info>  [1764719545.2820] manager: (tap13cba066-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.288 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.289 187247 INFO os_vif [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d')
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.515 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.734 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.734 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.735 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:25 compute-0 nova_compute[187243]: 2025-12-02 23:52:25.735 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:52:26 compute-0 sshd-session[210323]: Received disconnect from 23.95.37.90 port 51524:11: Bye Bye [preauth]
Dec 02 23:52:26 compute-0 sshd-session[210323]: Disconnected from authenticating user root 23.95.37.90 port 51524 [preauth]
Dec 02 23:52:26 compute-0 nova_compute[187243]: 2025-12-02 23:52:26.831 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:52:26 compute-0 nova_compute[187243]: 2025-12-02 23:52:26.832 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:52:26 compute-0 nova_compute[187243]: 2025-12-02 23:52:26.833 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] No VIF found with MAC fa:16:3e:b7:db:88, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:52:26 compute-0 nova_compute[187243]: 2025-12-02 23:52:26.834 187247 INFO nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Using config drive
Dec 02 23:52:27 compute-0 nova_compute[187243]: 2025-12-02 23:52:27.415 187247 WARNING neutronclient.v2_0.client [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:52:27 compute-0 nova_compute[187243]: 2025-12-02 23:52:27.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:27 compute-0 nova_compute[187243]: 2025-12-02 23:52:27.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:27 compute-0 nova_compute[187243]: 2025-12-02 23:52:27.984 187247 INFO nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Creating config drive at /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.config
Dec 02 23:52:27 compute-0 nova_compute[187243]: 2025-12-02 23:52:27.994 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp0woue1eh execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.134 187247 DEBUG oslo_concurrency.processutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp0woue1eh" returned: 0 in 0.140s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:28 compute-0 kernel: tap13cba066-1d: entered promiscuous mode
Dec 02 23:52:28 compute-0 NetworkManager[55671]: <info>  [1764719548.2087] manager: (tap13cba066-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Dec 02 23:52:28 compute-0 ovn_controller[95488]: 2025-12-02T23:52:28Z|00049|binding|INFO|Claiming lport 13cba066-1d7a-4449-8510-04cbf6aeed5f for this chassis.
Dec 02 23:52:28 compute-0 ovn_controller[95488]: 2025-12-02T23:52:28Z|00050|binding|INFO|13cba066-1d7a-4449-8510-04cbf6aeed5f: Claiming fa:16:3e:b7:db:88 10.100.0.4
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.243 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.249 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 systemd-udevd[210342]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:52:28 compute-0 systemd-machined[153518]: New machine qemu-2-instance-00000002.
Dec 02 23:52:28 compute-0 NetworkManager[55671]: <info>  [1764719548.2926] device (tap13cba066-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:52:28 compute-0 NetworkManager[55671]: <info>  [1764719548.2938] device (tap13cba066-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.318 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:db:88 10.100.0.4'], port_security=['fa:16:3e:b7:db:88 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '11673f94-0590-4a0b-a344-0dfac27faf87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9f795f84-34cd-4429-87ae-cc3f6a2442e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f7b0db-e402-4898-a011-4ec4b4fff19a, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=13cba066-1d7a-4449-8510-04cbf6aeed5f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.319 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 13cba066-1d7a-4449-8510-04cbf6aeed5f in datapath 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 bound to our chassis
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.321 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c29168d-89f5-4fdd-a1dd-76c0a34cef80
Dec 02 23:52:28 compute-0 ovn_controller[95488]: 2025-12-02T23:52:28Z|00051|binding|INFO|Setting lport 13cba066-1d7a-4449-8510-04cbf6aeed5f ovn-installed in OVS
Dec 02 23:52:28 compute-0 ovn_controller[95488]: 2025-12-02T23:52:28Z|00052|binding|INFO|Setting lport 13cba066-1d7a-4449-8510-04cbf6aeed5f up in Southbound
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.326 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.339 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1691c88b-eca6-40b7-af7d-c52f1d7ceb6f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.340 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2c29168d-81 in ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.349 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2c29168d-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.349 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[73effab1-6e73-41e0-9134-87d60b9fa33d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.350 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[beba0cda-84d3-439a-b1cc-e9272acbb3b2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.368 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1f41ef-6a66-4867-b135-dbb7288a6797]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.375 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[779a331c-8c7b-4593-be0c-259957731064]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.417 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4a5633-086d-4ae4-a2b7-66d60e5c7a4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.421 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb5d30b-d74e-4e48-8368-d664b3921a1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 NetworkManager[55671]: <info>  [1764719548.4231] manager: (tap2c29168d-80): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Dec 02 23:52:28 compute-0 systemd-udevd[210345]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.462 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0b14b4-3211-4001-8efd-2516f66f199b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.465 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[77fff11c-2b30-4809-8c7e-1cf5ba989983]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 NetworkManager[55671]: <info>  [1764719548.4941] device (tap2c29168d-80): carrier: link connected
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.501 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[4854adc7-115d-4fa0-b910-0acc42f27d98]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.515 187247 DEBUG nova.compute.manager [req-e9c34386-d6ee-448b-b880-4c0a06c1d1b5 req-ffddcd4a-100c-4908-9902-fc25aa4056c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-plugged-13cba066-1d7a-4449-8510-04cbf6aeed5f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.515 187247 DEBUG oslo_concurrency.lockutils [req-e9c34386-d6ee-448b-b880-4c0a06c1d1b5 req-ffddcd4a-100c-4908-9902-fc25aa4056c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.516 187247 DEBUG oslo_concurrency.lockutils [req-e9c34386-d6ee-448b-b880-4c0a06c1d1b5 req-ffddcd4a-100c-4908-9902-fc25aa4056c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.516 187247 DEBUG oslo_concurrency.lockutils [req-e9c34386-d6ee-448b-b880-4c0a06c1d1b5 req-ffddcd4a-100c-4908-9902-fc25aa4056c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.516 187247 DEBUG nova.compute.manager [req-e9c34386-d6ee-448b-b880-4c0a06c1d1b5 req-ffddcd4a-100c-4908-9902-fc25aa4056c4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Processing event network-vif-plugged-13cba066-1d7a-4449-8510-04cbf6aeed5f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.522 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[17c61059-910f-4d14-8150-f4780580db1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c29168d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:40:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363913, 'reachable_time': 31315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210376, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.546 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1c231b1d-21ae-4d0a-ac22-12500b7213ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe56:40c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363913, 'tstamp': 363913}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210377, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.575 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c9f24b-2331-4b39-9087-3c124c7e8bb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c29168d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:40:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363913, 'reachable_time': 31315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210378, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.622 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b4004402-825a-4ad9-8f66-5e78cbf496f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.707 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f5296abf-53ac-4758-b60f-bacb7cef471f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.708 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c29168d-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.708 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.709 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c29168d-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.710 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 NetworkManager[55671]: <info>  [1764719548.7115] manager: (tap2c29168d-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec 02 23:52:28 compute-0 kernel: tap2c29168d-80: entered promiscuous mode
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.713 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.715 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c29168d-80, col_values=(('external_ids', {'iface-id': '64808fc3-a961-45a4-b901-b4f1049f2d12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.716 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 ovn_controller[95488]: 2025-12-02T23:52:28Z|00053|binding|INFO|Releasing lport 64808fc3-a961-45a4-b901-b4f1049f2d12 from this chassis (sb_readonly=0)
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.740 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 nova_compute[187243]: 2025-12-02 23:52:28.743 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.745 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[134e511e-2f38-40c8-9c63-3251da0cc1e5]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.746 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.747 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.747 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.747 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.748 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a3933749-78a0-4e80-9918-7dabce933585]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.749 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.750 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2467fb-c2b6-4f7c-903a-b4e90e753b06]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.751 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: global
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-2c29168d-89f5-4fdd-a1dd-76c0a34cef80
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: defaults
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     log global
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID 2c29168d-89f5-4fdd-a1dd-76c0a34cef80
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:52:28 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:28.756 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'env', 'PROCESS_TAG=haproxy-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.085 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.091 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.096 187247 INFO nova.virt.libvirt.driver [-] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Instance spawned successfully.
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.097 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.108 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:52:29 compute-0 podman[210414]: 2025-12-02 23:52:29.240436227 +0000 UTC m=+0.068490160 container create 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:52:29 compute-0 systemd[1]: Started libpod-conmon-8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262.scope.
Dec 02 23:52:29 compute-0 podman[210414]: 2025-12-02 23:52:29.204904016 +0000 UTC m=+0.032958029 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:52:29 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:52:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d93b00f6b466acbf89f8cf3cda79d04ed486c7890edb09137a63493bb62a03b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:52:29 compute-0 podman[210414]: 2025-12-02 23:52:29.351491355 +0000 UTC m=+0.179545348 container init 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 02 23:52:29 compute-0 podman[210414]: 2025-12-02 23:52:29.357597633 +0000 UTC m=+0.185651576 container start 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:52:29 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [NOTICE]   (210434) : New worker (210436) forked
Dec 02 23:52:29 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [NOTICE]   (210434) : Loading success.
Dec 02 23:52:29 compute-0 podman[197600]: time="2025-12-02T23:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:52:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18287 "" "Go-http-client/1.1"
Dec 02 23:52:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3045 "" "Go-http-client/1.1"
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.769 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.770 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.770 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.771 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.771 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:29 compute-0 nova_compute[187243]: 2025-12-02 23:52:29.771 187247 DEBUG nova.virt.libvirt.driver [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.282 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.297 187247 INFO nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Took 9.56 seconds to spawn the instance on the hypervisor.
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.297 187247 DEBUG nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.517 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.765 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.837 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.838 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:30 compute-0 nova_compute[187243]: 2025-12-02 23:52:30.894 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.049 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.050 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.068 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.069 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5761MB free_disk=73.16846466064453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.069 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.069 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.356 187247 DEBUG nova.compute.manager [req-3af24e4b-d096-4821-83f8-cc62dcb0feb9 req-b19e4534-1618-4bb5-a5d4-24b1036e9a89 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-plugged-13cba066-1d7a-4449-8510-04cbf6aeed5f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.357 187247 DEBUG oslo_concurrency.lockutils [req-3af24e4b-d096-4821-83f8-cc62dcb0feb9 req-b19e4534-1618-4bb5-a5d4-24b1036e9a89 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.357 187247 DEBUG oslo_concurrency.lockutils [req-3af24e4b-d096-4821-83f8-cc62dcb0feb9 req-b19e4534-1618-4bb5-a5d4-24b1036e9a89 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.358 187247 DEBUG oslo_concurrency.lockutils [req-3af24e4b-d096-4821-83f8-cc62dcb0feb9 req-b19e4534-1618-4bb5-a5d4-24b1036e9a89 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.358 187247 DEBUG nova.compute.manager [req-3af24e4b-d096-4821-83f8-cc62dcb0feb9 req-b19e4534-1618-4bb5-a5d4-24b1036e9a89 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] No waiting events found dispatching network-vif-plugged-13cba066-1d7a-4449-8510-04cbf6aeed5f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.358 187247 WARNING nova.compute.manager [req-3af24e4b-d096-4821-83f8-cc62dcb0feb9 req-b19e4534-1618-4bb5-a5d4-24b1036e9a89 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received unexpected event network-vif-plugged-13cba066-1d7a-4449-8510-04cbf6aeed5f for instance with vm_state building and task_state spawning.
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: ERROR   23:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: ERROR   23:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: ERROR   23:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: ERROR   23:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: ERROR   23:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:52:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:52:31 compute-0 nova_compute[187243]: 2025-12-02 23:52:31.964 187247 INFO nova.compute.manager [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Took 15.94 seconds to build instance.
Dec 02 23:52:32 compute-0 nova_compute[187243]: 2025-12-02 23:52:32.472 187247 DEBUG oslo_concurrency.lockutils [None req-cbcce3f9-62b7-4f63-95a8-efa41a6525ef d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.525s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:32 compute-0 nova_compute[187243]: 2025-12-02 23:52:32.489 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 11673f94-0590-4a0b-a344-0dfac27faf87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:52:32 compute-0 nova_compute[187243]: 2025-12-02 23:52:32.490 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:52:32 compute-0 nova_compute[187243]: 2025-12-02 23:52:32.490 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:52:31 up  1:00,  0 user,  load average: 0.32, 0.37, 0.48\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_059ee4b8b9ab47ffbc539c03339a4112': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:52:32 compute-0 nova_compute[187243]: 2025-12-02 23:52:32.551 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:52:33 compute-0 nova_compute[187243]: 2025-12-02 23:52:33.059 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:52:33 compute-0 podman[210452]: 2025-12-02 23:52:33.115166145 +0000 UTC m=+0.070990180 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 23:52:33 compute-0 nova_compute[187243]: 2025-12-02 23:52:33.579 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:52:33 compute-0 nova_compute[187243]: 2025-12-02 23:52:33.580 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.511s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:52:34 compute-0 nova_compute[187243]: 2025-12-02 23:52:34.577 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:34 compute-0 nova_compute[187243]: 2025-12-02 23:52:34.578 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:35 compute-0 nova_compute[187243]: 2025-12-02 23:52:35.096 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:52:35 compute-0 nova_compute[187243]: 2025-12-02 23:52:35.321 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:35 compute-0 nova_compute[187243]: 2025-12-02 23:52:35.519 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:38 compute-0 podman[210474]: 2025-12-02 23:52:38.118487418 +0000 UTC m=+0.071707708 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:52:40 compute-0 nova_compute[187243]: 2025-12-02 23:52:40.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:40 compute-0 nova_compute[187243]: 2025-12-02 23:52:40.523 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:42 compute-0 ovn_controller[95488]: 2025-12-02T23:52:42Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:db:88 10.100.0.4
Dec 02 23:52:42 compute-0 ovn_controller[95488]: 2025-12-02T23:52:42Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:db:88 10.100.0.4
Dec 02 23:52:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:45.068 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:52:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:45.069 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:52:45 compute-0 nova_compute[187243]: 2025-12-02 23:52:45.112 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:45 compute-0 nova_compute[187243]: 2025-12-02 23:52:45.328 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:45 compute-0 nova_compute[187243]: 2025-12-02 23:52:45.526 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:48 compute-0 podman[210510]: 2025-12-02 23:52:48.139135141 +0000 UTC m=+0.075908229 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:52:49 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:52:49.071 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:52:49 compute-0 sshd-session[210494]: Received disconnect from 45.78.222.160 port 42690:11: Bye Bye [preauth]
Dec 02 23:52:49 compute-0 sshd-session[210494]: Disconnected from 45.78.222.160 port 42690 [preauth]
Dec 02 23:52:50 compute-0 nova_compute[187243]: 2025-12-02 23:52:50.331 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:50 compute-0 nova_compute[187243]: 2025-12-02 23:52:50.530 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:52 compute-0 podman[210534]: 2025-12-02 23:52:52.104139955 +0000 UTC m=+0.057290908 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 02 23:52:52 compute-0 podman[210535]: 2025-12-02 23:52:52.191288205 +0000 UTC m=+0.128657016 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:52:55 compute-0 nova_compute[187243]: 2025-12-02 23:52:55.334 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:55 compute-0 nova_compute[187243]: 2025-12-02 23:52:55.532 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:52:59 compute-0 podman[197600]: time="2025-12-02T23:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:52:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18287 "" "Go-http-client/1.1"
Dec 02 23:52:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Dec 02 23:53:00 compute-0 nova_compute[187243]: 2025-12-02 23:53:00.337 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:00 compute-0 nova_compute[187243]: 2025-12-02 23:53:00.534 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:00.673 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:00.674 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:00.675 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: ERROR   23:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: ERROR   23:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: ERROR   23:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: ERROR   23:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: ERROR   23:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:53:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:53:04 compute-0 podman[210584]: 2025-12-02 23:53:04.12332864 +0000 UTC m=+0.076266732 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, distribution-scope=public)
Dec 02 23:53:04 compute-0 sshd-session[210582]: Received disconnect from 20.123.120.169 port 38514:11: Bye Bye [preauth]
Dec 02 23:53:04 compute-0 sshd-session[210582]: Disconnected from authenticating user root 20.123.120.169 port 38514 [preauth]
Dec 02 23:53:05 compute-0 nova_compute[187243]: 2025-12-02 23:53:05.340 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:05 compute-0 nova_compute[187243]: 2025-12-02 23:53:05.535 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:09 compute-0 podman[210605]: 2025-12-02 23:53:09.156391304 +0000 UTC m=+0.089439772 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:53:10 compute-0 nova_compute[187243]: 2025-12-02 23:53:10.343 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:10 compute-0 nova_compute[187243]: 2025-12-02 23:53:10.537 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.346 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.541 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.817 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.818 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.818 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.819 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.819 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:15 compute-0 nova_compute[187243]: 2025-12-02 23:53:15.977 187247 INFO nova.compute.manager [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Terminating instance
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.495 187247 DEBUG nova.compute.manager [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 02 23:53:16 compute-0 kernel: tap13cba066-1d (unregistering): left promiscuous mode
Dec 02 23:53:16 compute-0 NetworkManager[55671]: <info>  [1764719596.5214] device (tap13cba066-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.527 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 ovn_controller[95488]: 2025-12-02T23:53:16Z|00054|binding|INFO|Releasing lport 13cba066-1d7a-4449-8510-04cbf6aeed5f from this chassis (sb_readonly=0)
Dec 02 23:53:16 compute-0 ovn_controller[95488]: 2025-12-02T23:53:16Z|00055|binding|INFO|Setting lport 13cba066-1d7a-4449-8510-04cbf6aeed5f down in Southbound
Dec 02 23:53:16 compute-0 ovn_controller[95488]: 2025-12-02T23:53:16Z|00056|binding|INFO|Removing iface tap13cba066-1d ovn-installed in OVS
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.529 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.534 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:db:88 10.100.0.4'], port_security=['fa:16:3e:b7:db:88 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '11673f94-0590-4a0b-a344-0dfac27faf87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '059ee4b8b9ab47ffbc539c03339a4112', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9f795f84-34cd-4429-87ae-cc3f6a2442e6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f7b0db-e402-4898-a011-4ec4b4fff19a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=13cba066-1d7a-4449-8510-04cbf6aeed5f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.536 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 13cba066-1d7a-4449-8510-04cbf6aeed5f in datapath 2c29168d-89f5-4fdd-a1dd-76c0a34cef80 unbound from our chassis
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.537 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.538 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[85566ca6-c3a6-4c56-bc55-545d86e34035]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.539 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 namespace which is not needed anymore
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.563 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 02 23:53:16 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 14.265s CPU time.
Dec 02 23:53:16 compute-0 systemd-machined[153518]: Machine qemu-2-instance-00000002 terminated.
Dec 02 23:53:16 compute-0 podman[210655]: 2025-12-02 23:53:16.686766851 +0000 UTC m=+0.037154793 container kill 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 02 23:53:16 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [NOTICE]   (210434) : haproxy version is 3.0.5-8e879a5
Dec 02 23:53:16 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [NOTICE]   (210434) : path to executable is /usr/sbin/haproxy
Dec 02 23:53:16 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [WARNING]  (210434) : Exiting Master process...
Dec 02 23:53:16 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [ALERT]    (210434) : Current worker (210436) exited with code 143 (Terminated)
Dec 02 23:53:16 compute-0 neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80[210430]: [WARNING]  (210434) : All workers exited. Exiting... (0)
Dec 02 23:53:16 compute-0 systemd[1]: libpod-8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262.scope: Deactivated successfully.
Dec 02 23:53:16 compute-0 conmon[210430]: conmon 8b720d406783c8070612 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262.scope/container/memory.events
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.719 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.724 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 podman[210670]: 2025-12-02 23:53:16.746173392 +0000 UTC m=+0.035507902 container died 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.766 187247 INFO nova.virt.libvirt.driver [-] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Instance destroyed successfully.
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.767 187247 DEBUG nova.objects.instance [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lazy-loading 'resources' on Instance uuid 11673f94-0590-4a0b-a344-0dfac27faf87 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:53:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262-userdata-shm.mount: Deactivated successfully.
Dec 02 23:53:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-d93b00f6b466acbf89f8cf3cda79d04ed486c7890edb09137a63493bb62a03b3-merged.mount: Deactivated successfully.
Dec 02 23:53:16 compute-0 podman[210670]: 2025-12-02 23:53:16.78606413 +0000 UTC m=+0.075398610 container cleanup 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 02 23:53:16 compute-0 systemd[1]: libpod-conmon-8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262.scope: Deactivated successfully.
Dec 02 23:53:16 compute-0 podman[210672]: 2025-12-02 23:53:16.802451328 +0000 UTC m=+0.080097265 container remove 8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.808 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe4809c-fe13-411c-88ec-8998c95b3770]: (4, ("Tue Dec  2 11:53:16 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 (8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262)\n8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262\nTue Dec  2 11:53:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 (8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262)\n8b720d406783c80706121bc7958f0cce22e38ef6bf11efb7ecd59383e8c01262\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.810 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7fdf6347-8237-41b3-b605-2eae698577fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.810 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c29168d-89f5-4fdd-a1dd-76c0a34cef80.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.811 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7154dce0-ec03-4b79-9f06-6d329dc11e07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.812 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c29168d-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.814 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 kernel: tap2c29168d-80: left promiscuous mode
Dec 02 23:53:16 compute-0 nova_compute[187243]: 2025-12-02 23:53:16.833 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.836 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a00507-eab2-477e-a983-67284f708ae2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.851 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[958e46a3-fd4d-4a28-b8f3-8bf3ea22f16c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.852 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0e91e577-ad53-42ee-a680-c94437d6a27e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.872 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[95b8b8c9-d6c1-40fe-949c-1063894fb725]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363905, 'reachable_time': 19654, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210718, 'error': None, 'target': 'ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.875 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2c29168d-89f5-4fdd-a1dd-76c0a34cef80 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 02 23:53:16 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:16.875 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5874c4-3204-4f35-b5cb-e4922c556079]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d2c29168d\x2d89f5\x2d4fdd\x2da1dd\x2d76c0a34cef80.mount: Deactivated successfully.
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.200 187247 DEBUG nova.compute.manager [req-e662c832-1503-4ffc-801c-c64302fc5e77 req-d03623a6-a84c-4884-a1ba-64ccb4fd64e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-unplugged-13cba066-1d7a-4449-8510-04cbf6aeed5f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.200 187247 DEBUG oslo_concurrency.lockutils [req-e662c832-1503-4ffc-801c-c64302fc5e77 req-d03623a6-a84c-4884-a1ba-64ccb4fd64e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.201 187247 DEBUG oslo_concurrency.lockutils [req-e662c832-1503-4ffc-801c-c64302fc5e77 req-d03623a6-a84c-4884-a1ba-64ccb4fd64e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.201 187247 DEBUG oslo_concurrency.lockutils [req-e662c832-1503-4ffc-801c-c64302fc5e77 req-d03623a6-a84c-4884-a1ba-64ccb4fd64e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.201 187247 DEBUG nova.compute.manager [req-e662c832-1503-4ffc-801c-c64302fc5e77 req-d03623a6-a84c-4884-a1ba-64ccb4fd64e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] No waiting events found dispatching network-vif-unplugged-13cba066-1d7a-4449-8510-04cbf6aeed5f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.202 187247 DEBUG nova.compute.manager [req-e662c832-1503-4ffc-801c-c64302fc5e77 req-d03623a6-a84c-4884-a1ba-64ccb4fd64e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-unplugged-13cba066-1d7a-4449-8510-04cbf6aeed5f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.275 187247 DEBUG nova.virt.libvirt.vif [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:52:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1017793270',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1017793270',id=2,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:52:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='059ee4b8b9ab47ffbc539c03339a4112',ramdisk_id='',reservation_id='r-6r0rl3jv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-TestDataModel-1253061916',owner_user_name='tempest-TestDataModel-1253061916-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:52:31Z,user_data=None,user_id='d032790eea2c4094b69ea4a2576bff68',uuid=11673f94-0590-4a0b-a344-0dfac27faf87,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.276 187247 DEBUG nova.network.os_vif_util [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converting VIF {"id": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "address": "fa:16:3e:b7:db:88", "network": {"id": "2c29168d-89f5-4fdd-a1dd-76c0a34cef80", "bridge": "br-int", "label": "tempest-TestDataModel-1619373275-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3957003f9ee8492688556ccb0cd5fdd5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13cba066-1d", "ovs_interfaceid": "13cba066-1d7a-4449-8510-04cbf6aeed5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.277 187247 DEBUG nova.network.os_vif_util [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.277 187247 DEBUG os_vif [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.280 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.281 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13cba066-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.282 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.285 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.285 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=599dfd74-b74a-4905-b552-f21706daa464) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.286 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.288 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.292 187247 INFO os_vif [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:db:88,bridge_name='br-int',has_traffic_filtering=True,id=13cba066-1d7a-4449-8510-04cbf6aeed5f,network=Network(2c29168d-89f5-4fdd-a1dd-76c0a34cef80),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13cba066-1d')
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.292 187247 INFO nova.virt.libvirt.driver [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Deleting instance files /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87_del
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.294 187247 INFO nova.virt.libvirt.driver [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Deletion of /var/lib/nova/instances/11673f94-0590-4a0b-a344-0dfac27faf87_del complete
Dec 02 23:53:17 compute-0 sshd-session[210627]: Received disconnect from 61.220.235.10 port 41648:11: Bye Bye [preauth]
Dec 02 23:53:17 compute-0 sshd-session[210627]: Disconnected from authenticating user root 61.220.235.10 port 41648 [preauth]
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.807 187247 INFO nova.compute.manager [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Took 1.31 seconds to destroy the instance on the hypervisor.
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.809 187247 DEBUG oslo.service.backend._eventlet.loopingcall [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.809 187247 DEBUG nova.compute.manager [-] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.810 187247 DEBUG nova.network.neutron [-] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 02 23:53:17 compute-0 nova_compute[187243]: 2025-12-02 23:53:17.810 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:53:18 compute-0 nova_compute[187243]: 2025-12-02 23:53:18.057 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:53:18 compute-0 nova_compute[187243]: 2025-12-02 23:53:18.948 187247 DEBUG nova.network.neutron [-] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:53:19 compute-0 podman[210719]: 2025-12-02 23:53:19.099156081 +0000 UTC m=+0.059080734 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.269 187247 DEBUG nova.compute.manager [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-unplugged-13cba066-1d7a-4449-8510-04cbf6aeed5f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.270 187247 DEBUG oslo_concurrency.lockutils [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.270 187247 DEBUG oslo_concurrency.lockutils [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.270 187247 DEBUG oslo_concurrency.lockutils [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.271 187247 DEBUG nova.compute.manager [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] No waiting events found dispatching network-vif-unplugged-13cba066-1d7a-4449-8510-04cbf6aeed5f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.271 187247 DEBUG nova.compute.manager [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-unplugged-13cba066-1d7a-4449-8510-04cbf6aeed5f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.271 187247 DEBUG nova.compute.manager [req-9226598a-4494-490c-a73d-5a74ce6083b3 req-e2ba32fc-1dab-4054-8de3-e94d0bf4be25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Received event network-vif-deleted-13cba066-1d7a-4449-8510-04cbf6aeed5f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.455 187247 INFO nova.compute.manager [-] [instance: 11673f94-0590-4a0b-a344-0dfac27faf87] Took 1.65 seconds to deallocate network for instance.
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.981 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:19 compute-0 nova_compute[187243]: 2025-12-02 23:53:19.981 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:20 compute-0 nova_compute[187243]: 2025-12-02 23:53:20.030 187247 DEBUG nova.compute.provider_tree [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:53:20 compute-0 nova_compute[187243]: 2025-12-02 23:53:20.538 187247 DEBUG nova.scheduler.client.report [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:53:20 compute-0 nova_compute[187243]: 2025-12-02 23:53:20.544 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:21 compute-0 nova_compute[187243]: 2025-12-02 23:53:21.057 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:21 compute-0 nova_compute[187243]: 2025-12-02 23:53:21.093 187247 INFO nova.scheduler.client.report [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Deleted allocations for instance 11673f94-0590-4a0b-a344-0dfac27faf87
Dec 02 23:53:21 compute-0 nova_compute[187243]: 2025-12-02 23:53:21.486 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.003 187247 WARNING nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.003 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Triggering sync for uuid 11673f94-0590-4a0b-a344-0dfac27faf87 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.004 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "11673f94-0590-4a0b-a344-0dfac27faf87" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.138 187247 DEBUG oslo_concurrency.lockutils [None req-e013924e-07e2-4907-ada5-eb1f2b770255 d032790eea2c4094b69ea4a2576bff68 059ee4b8b9ab47ffbc539c03339a4112 - - default default] Lock "11673f94-0590-4a0b-a344-0dfac27faf87" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.320s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.139 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "11673f94-0590-4a0b-a344-0dfac27faf87" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.287 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:22 compute-0 nova_compute[187243]: 2025-12-02 23:53:22.653 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "11673f94-0590-4a0b-a344-0dfac27faf87" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.514s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:23 compute-0 podman[210744]: 2025-12-02 23:53:23.15948297 +0000 UTC m=+0.103362059 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 02 23:53:23 compute-0 podman[210745]: 2025-12-02 23:53:23.173484259 +0000 UTC m=+0.106114026 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 02 23:53:25 compute-0 nova_compute[187243]: 2025-12-02 23:53:25.544 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:25 compute-0 nova_compute[187243]: 2025-12-02 23:53:25.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:25 compute-0 nova_compute[187243]: 2025-12-02 23:53:25.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:25 compute-0 nova_compute[187243]: 2025-12-02 23:53:25.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:53:26 compute-0 nova_compute[187243]: 2025-12-02 23:53:26.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:27 compute-0 nova_compute[187243]: 2025-12-02 23:53:27.289 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:27 compute-0 nova_compute[187243]: 2025-12-02 23:53:27.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:28 compute-0 nova_compute[187243]: 2025-12-02 23:53:28.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:28 compute-0 nova_compute[187243]: 2025-12-02 23:53:28.738 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:28 compute-0 sshd-session[210786]: Received disconnect from 102.210.148.92 port 34336:11: Bye Bye [preauth]
Dec 02 23:53:28 compute-0 sshd-session[210786]: Disconnected from authenticating user root 102.210.148.92 port 34336 [preauth]
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.107 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.319 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.321 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.367 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.368 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5811MB free_disk=73.16926574707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.368 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:53:29 compute-0 nova_compute[187243]: 2025-12-02 23:53:29.369 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:53:29 compute-0 sshd[128750]: Timeout before authentication for connection from 101.47.140.127 to 38.102.83.77, pid = 209973
Dec 02 23:53:29 compute-0 podman[197600]: time="2025-12-02T23:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:53:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:53:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec 02 23:53:30 compute-0 nova_compute[187243]: 2025-12-02 23:53:30.490 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:53:30 compute-0 nova_compute[187243]: 2025-12-02 23:53:30.491 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:53:29 up  1:01,  0 user,  load average: 0.20, 0.33, 0.46\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:53:30 compute-0 nova_compute[187243]: 2025-12-02 23:53:30.545 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:30 compute-0 nova_compute[187243]: 2025-12-02 23:53:30.568 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:53:30 compute-0 sshd-session[210788]: Invalid user scan from 45.78.219.213 port 34728
Dec 02 23:53:31 compute-0 nova_compute[187243]: 2025-12-02 23:53:31.077 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:53:31 compute-0 sshd-session[210788]: Received disconnect from 45.78.219.213 port 34728:11: Bye Bye [preauth]
Dec 02 23:53:31 compute-0 sshd-session[210788]: Disconnected from invalid user scan 45.78.219.213 port 34728 [preauth]
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: ERROR   23:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: ERROR   23:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: ERROR   23:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: ERROR   23:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: ERROR   23:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:53:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:53:31 compute-0 nova_compute[187243]: 2025-12-02 23:53:31.586 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:53:31 compute-0 nova_compute[187243]: 2025-12-02 23:53:31.587 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.218s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:53:32 compute-0 nova_compute[187243]: 2025-12-02 23:53:32.292 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:32 compute-0 nova_compute[187243]: 2025-12-02 23:53:32.586 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:32 compute-0 nova_compute[187243]: 2025-12-02 23:53:32.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:32 compute-0 nova_compute[187243]: 2025-12-02 23:53:32.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:53:35 compute-0 podman[210792]: 2025-12-02 23:53:35.121384921 +0000 UTC m=+0.070045980 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 02 23:53:35 compute-0 nova_compute[187243]: 2025-12-02 23:53:35.548 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:37 compute-0 nova_compute[187243]: 2025-12-02 23:53:37.293 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:40 compute-0 podman[210814]: 2025-12-02 23:53:40.180063537 +0000 UTC m=+0.123296793 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4)
Dec 02 23:53:40 compute-0 nova_compute[187243]: 2025-12-02 23:53:40.549 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:42 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:42.089 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:0f:8c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ba5fccf757b4adaa08907c11ae17f57', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9ee451cb-cc6e-44d6-98fb-cdfa0566e521) old=Port_Binding(mac=['fa:16:3e:ab:0f:8c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ba5fccf757b4adaa08907c11ae17f57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:42 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:42.090 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9ee451cb-cc6e-44d6-98fb-cdfa0566e521 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a updated
Dec 02 23:53:42 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:42.092 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:53:42 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:42.093 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a58195a9-1e19-4758-84ce-fed071432c87]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:42 compute-0 nova_compute[187243]: 2025-12-02 23:53:42.295 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:44 compute-0 sshd[128750]: Timeout before authentication for connection from 45.78.218.154 to 38.102.83.77, pid = 210118
Dec 02 23:53:45 compute-0 nova_compute[187243]: 2025-12-02 23:53:45.551 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:46 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:46.430 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:46 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:46.430 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:53:46 compute-0 nova_compute[187243]: 2025-12-02 23:53:46.431 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:46 compute-0 sshd-session[210835]: Received disconnect from 23.95.37.90 port 37386:11: Bye Bye [preauth]
Dec 02 23:53:46 compute-0 sshd-session[210835]: Disconnected from authenticating user root 23.95.37.90 port 37386 [preauth]
Dec 02 23:53:47 compute-0 nova_compute[187243]: 2025-12-02 23:53:47.297 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:50 compute-0 podman[210837]: 2025-12-02 23:53:50.127441004 +0000 UTC m=+0.072348127 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:53:50 compute-0 nova_compute[187243]: 2025-12-02 23:53:50.553 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:52.107 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:ed:a4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c8e6c60-6229-4bb7-a7c1-8d84b1a1b4af, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=46cd7a9e-86e4-4aaf-b5c5-07e80f59a989) old=Port_Binding(mac=['fa:16:3e:35:ed:a4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb0fa041-521f-435f-82ea-d7eab4f5ab40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:53:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:52.108 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 46cd7a9e-86e4-4aaf-b5c5-07e80f59a989 in datapath cb0fa041-521f-435f-82ea-d7eab4f5ab40 updated
Dec 02 23:53:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:52.110 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cb0fa041-521f-435f-82ea-d7eab4f5ab40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:53:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:52.112 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f40ea5-f954-4be6-b3f6-c6d9f8e99970]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:53:52 compute-0 nova_compute[187243]: 2025-12-02 23:53:52.299 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:54 compute-0 podman[210861]: 2025-12-02 23:53:54.102326631 +0000 UTC m=+0.064598539 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 02 23:53:54 compute-0 podman[210862]: 2025-12-02 23:53:54.152265943 +0000 UTC m=+0.101463193 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 02 23:53:55 compute-0 nova_compute[187243]: 2025-12-02 23:53:55.554 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:56 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:53:56.431 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:53:57 compute-0 sshd-session[210906]: Invalid user ghost from 49.247.36.49 port 31495
Dec 02 23:53:57 compute-0 sshd-session[210906]: Received disconnect from 49.247.36.49 port 31495:11: Bye Bye [preauth]
Dec 02 23:53:57 compute-0 sshd-session[210906]: Disconnected from invalid user ghost 49.247.36.49 port 31495 [preauth]
Dec 02 23:53:57 compute-0 nova_compute[187243]: 2025-12-02 23:53:57.350 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:53:57 compute-0 sshd[128750]: drop connection #0 from [101.47.140.127]:60632 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 02 23:53:59 compute-0 podman[197600]: time="2025-12-02T23:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:53:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:53:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec 02 23:54:00 compute-0 nova_compute[187243]: 2025-12-02 23:54:00.556 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:00.676 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:00.676 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:00.676 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: ERROR   23:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: ERROR   23:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: ERROR   23:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: ERROR   23:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: ERROR   23:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:54:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:54:02 compute-0 nova_compute[187243]: 2025-12-02 23:54:02.390 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:03 compute-0 ovn_controller[95488]: 2025-12-02T23:54:03Z|00057|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 02 23:54:05 compute-0 sshd[128750]: drop connection #0 from [45.78.218.154]:33618 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 02 23:54:05 compute-0 nova_compute[187243]: 2025-12-02 23:54:05.560 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:06 compute-0 podman[210909]: 2025-12-02 23:54:06.124821302 +0000 UTC m=+0.082332589 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:54:07 compute-0 nova_compute[187243]: 2025-12-02 23:54:07.441 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:10 compute-0 nova_compute[187243]: 2025-12-02 23:54:10.561 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:11 compute-0 podman[210931]: 2025-12-02 23:54:11.116426171 +0000 UTC m=+0.065125352 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd)
Dec 02 23:54:12 compute-0 nova_compute[187243]: 2025-12-02 23:54:12.444 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:15 compute-0 nova_compute[187243]: 2025-12-02 23:54:15.547 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:15 compute-0 nova_compute[187243]: 2025-12-02 23:54:15.548 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:15 compute-0 nova_compute[187243]: 2025-12-02 23:54:15.562 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:16 compute-0 nova_compute[187243]: 2025-12-02 23:54:16.053 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:54:16 compute-0 nova_compute[187243]: 2025-12-02 23:54:16.687 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:16 compute-0 nova_compute[187243]: 2025-12-02 23:54:16.688 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:16 compute-0 nova_compute[187243]: 2025-12-02 23:54:16.698 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:54:16 compute-0 nova_compute[187243]: 2025-12-02 23:54:16.698 187247 INFO nova.compute.claims [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Claim successful on node compute-0.ctlplane.example.com
Dec 02 23:54:17 compute-0 nova_compute[187243]: 2025-12-02 23:54:17.497 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:17 compute-0 nova_compute[187243]: 2025-12-02 23:54:17.776 187247 DEBUG nova.compute.provider_tree [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:54:18 compute-0 nova_compute[187243]: 2025-12-02 23:54:18.284 187247 DEBUG nova.scheduler.client.report [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:54:18 compute-0 nova_compute[187243]: 2025-12-02 23:54:18.804 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:18 compute-0 nova_compute[187243]: 2025-12-02 23:54:18.804 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:54:19 compute-0 nova_compute[187243]: 2025-12-02 23:54:19.315 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:54:19 compute-0 nova_compute[187243]: 2025-12-02 23:54:19.316 187247 DEBUG nova.network.neutron [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:54:19 compute-0 nova_compute[187243]: 2025-12-02 23:54:19.317 187247 WARNING neutronclient.v2_0.client [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:19 compute-0 nova_compute[187243]: 2025-12-02 23:54:19.318 187247 WARNING neutronclient.v2_0.client [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:19 compute-0 nova_compute[187243]: 2025-12-02 23:54:19.827 187247 INFO nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:54:20 compute-0 nova_compute[187243]: 2025-12-02 23:54:20.340 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:54:20 compute-0 nova_compute[187243]: 2025-12-02 23:54:20.567 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:21 compute-0 podman[210952]: 2025-12-02 23:54:21.100936608 +0000 UTC m=+0.061916763 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.365 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.367 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.367 187247 INFO nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Creating image(s)
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.367 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.368 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.368 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.369 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.373 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.375 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.428 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.429 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.430 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.431 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.437 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.438 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.465 187247 DEBUG nova.network.neutron [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Successfully created port: fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.490 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.491 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.522 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.523 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.523 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:21 compute-0 sshd-session[210976]: Invalid user admin1 from 20.123.120.169 port 56870
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.605 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.605 187247 DEBUG nova.virt.disk.api [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Checking if we can resize image /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.606 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:21 compute-0 sshd-session[210976]: Received disconnect from 20.123.120.169 port 56870:11: Bye Bye [preauth]
Dec 02 23:54:21 compute-0 sshd-session[210976]: Disconnected from invalid user admin1 20.123.120.169 port 56870 [preauth]
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.688 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.689 187247 DEBUG nova.virt.disk.api [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Cannot resize image /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.690 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.691 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Ensure instance console log exists: /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.692 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.692 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:21 compute-0 nova_compute[187243]: 2025-12-02 23:54:21.693 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.073 187247 DEBUG nova.network.neutron [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Successfully updated port: fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.130 187247 DEBUG nova.compute.manager [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-changed-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.130 187247 DEBUG nova.compute.manager [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Refreshing instance network info cache due to event network-changed-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.131 187247 DEBUG oslo_concurrency.lockutils [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.131 187247 DEBUG oslo_concurrency.lockutils [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.131 187247 DEBUG nova.network.neutron [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Refreshing network info cache for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.500 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.579 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:54:22 compute-0 nova_compute[187243]: 2025-12-02 23:54:22.637 187247 WARNING neutronclient.v2_0.client [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:22 compute-0 sshd-session[210993]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Dec 02 23:54:23 compute-0 nova_compute[187243]: 2025-12-02 23:54:23.089 187247 DEBUG nova.network.neutron [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:54:23 compute-0 nova_compute[187243]: 2025-12-02 23:54:23.242 187247 DEBUG nova.network.neutron [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:54:23 compute-0 nova_compute[187243]: 2025-12-02 23:54:23.753 187247 DEBUG oslo_concurrency.lockutils [req-85cd02d2-1347-42d2-ace3-31ea07a2e1ba req-e8218e3b-ada8-4666-b67c-39d02899cd3a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:54:23 compute-0 nova_compute[187243]: 2025-12-02 23:54:23.754 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:54:23 compute-0 nova_compute[187243]: 2025-12-02 23:54:23.754 187247 DEBUG nova.network.neutron [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:54:25 compute-0 nova_compute[187243]: 2025-12-02 23:54:25.088 187247 DEBUG nova.network.neutron [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:54:25 compute-0 podman[210996]: 2025-12-02 23:54:25.138294901 +0000 UTC m=+0.088916839 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 02 23:54:25 compute-0 podman[210997]: 2025-12-02 23:54:25.169445027 +0000 UTC m=+0.115520854 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:54:25 compute-0 nova_compute[187243]: 2025-12-02 23:54:25.567 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.054 187247 WARNING neutronclient.v2_0.client [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.249 187247 DEBUG nova.network.neutron [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.761 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.762 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance network_info: |[{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.767 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Start _get_guest_xml network_info=[{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.773 187247 WARNING nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.775 187247 DEBUG nova.virt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-607610768', uuid='d8ccd45c-e570-4b75-b836-a93e2de1818b'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": 
"fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719666.775649) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.781 187247 DEBUG nova.virt.libvirt.host [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.782 187247 DEBUG nova.virt.libvirt.host [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.786 187247 DEBUG nova.virt.libvirt.host [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.787 187247 DEBUG nova.virt.libvirt.host [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.789 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.790 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.790 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.791 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.791 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.791 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.792 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.792 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.793 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.794 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.795 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.795 187247 DEBUG nova.virt.hardware [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.801 187247 DEBUG nova.virt.libvirt.vif [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:54:20Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.802 187247 DEBUG nova.network.os_vif_util [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.803 187247 DEBUG nova.network.os_vif_util [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:54:26 compute-0 nova_compute[187243]: 2025-12-02 23:54:26.805 187247 DEBUG nova.objects.instance [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'pci_devices' on Instance uuid d8ccd45c-e570-4b75-b836-a93e2de1818b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.313 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <uuid>d8ccd45c-e570-4b75-b836-a93e2de1818b</uuid>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <name>instance-00000004</name>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-607610768</nova:name>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:54:26</nova:creationTime>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:54:27 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:54:27 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         <nova:port uuid="fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe">
Dec 02 23:54:27 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <system>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <entry name="serial">d8ccd45c-e570-4b75-b836-a93e2de1818b</entry>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <entry name="uuid">d8ccd45c-e570-4b75-b836-a93e2de1818b</entry>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </system>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <os>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </os>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <features>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </features>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:f8:84:51"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <target dev="tapfbb4ca60-8a"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/console.log" append="off"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <video>
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </video>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:54:27 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:54:27 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:54:27 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:54:27 compute-0 nova_compute[187243]: </domain>
Dec 02 23:54:27 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.314 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Preparing to wait for external event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.314 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.314 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.315 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.315 187247 DEBUG nova.virt.libvirt.vif [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:54:20Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.316 187247 DEBUG nova.network.os_vif_util [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.316 187247 DEBUG nova.network.os_vif_util [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.317 187247 DEBUG os_vif [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.317 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.317 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.318 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.318 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.319 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8fb1aca8-1952-5444-86e5-0e27fed9a58c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.320 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.322 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.325 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbb4ca60-8a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.326 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapfbb4ca60-8a, col_values=(('qos', UUID('46908d5b-b2fd-4426-9859-ca4f295ab544')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.326 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapfbb4ca60-8a, col_values=(('external_ids', {'iface-id': 'fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:84:51', 'vm-uuid': 'd8ccd45c-e570-4b75-b836-a93e2de1818b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.327 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:27 compute-0 NetworkManager[55671]: <info>  [1764719667.3282] manager: (tapfbb4ca60-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.329 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.335 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.336 187247 INFO os_vif [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a')
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:27 compute-0 nova_compute[187243]: 2025-12-02 23:54:27.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:54:28 compute-0 nova_compute[187243]: 2025-12-02 23:54:28.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:28 compute-0 nova_compute[187243]: 2025-12-02 23:54:28.881 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:54:28 compute-0 nova_compute[187243]: 2025-12-02 23:54:28.882 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:54:28 compute-0 nova_compute[187243]: 2025-12-02 23:54:28.883 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No VIF found with MAC fa:16:3e:f8:84:51, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:54:28 compute-0 nova_compute[187243]: 2025-12-02 23:54:28.884 187247 INFO nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Using config drive
Dec 02 23:54:29 compute-0 nova_compute[187243]: 2025-12-02 23:54:29.397 187247 WARNING neutronclient.v2_0.client [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:54:29 compute-0 nova_compute[187243]: 2025-12-02 23:54:29.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:29 compute-0 podman[197600]: time="2025-12-02T23:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:54:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:54:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.110 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.228 187247 INFO nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Creating config drive at /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.238 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp2who4axz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.385 187247 DEBUG oslo_concurrency.processutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp2who4axz" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:30 compute-0 kernel: tapfbb4ca60-8a: entered promiscuous mode
Dec 02 23:54:30 compute-0 NetworkManager[55671]: <info>  [1764719670.4710] manager: (tapfbb4ca60-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Dec 02 23:54:30 compute-0 ovn_controller[95488]: 2025-12-02T23:54:30Z|00058|binding|INFO|Claiming lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for this chassis.
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.473 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 ovn_controller[95488]: 2025-12-02T23:54:30Z|00059|binding|INFO|fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe: Claiming fa:16:3e:f8:84:51 10.100.0.10
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.486 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.496 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:84:51 10.100.0.10'], port_security=['fa:16:3e:f8:84:51 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8ccd45c-e570-4b75-b836-a93e2de1818b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.497 104379 INFO neutron.agent.ovn.metadata.agent [-] Port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.499 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:54:30 compute-0 systemd-udevd[211062]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:54:30 compute-0 systemd-machined[153518]: New machine qemu-3-instance-00000004.
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.522 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ad29c1ad-3632-48b0-9c24-ae4916a0eb36]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.523 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec494140-a1 in ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.525 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec494140-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.525 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[49826a01-50b2-498d-ab40-ec73e0980e26]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.527 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[20ed1d2f-562e-457b-a132-f1ab827d3c32]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.542 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[a33047f3-d74e-41a0-9520-6eb4d6c98270]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 NetworkManager[55671]: <info>  [1764719670.5448] device (tapfbb4ca60-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:54:30 compute-0 NetworkManager[55671]: <info>  [1764719670.5463] device (tapfbb4ca60-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:54:30 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.568 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[424fe247-50ea-444f-a977-3c41b64ba360]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_controller[95488]: 2025-12-02T23:54:30Z|00060|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe ovn-installed in OVS
Dec 02 23:54:30 compute-0 ovn_controller[95488]: 2025-12-02T23:54:30Z|00061|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe up in Southbound
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.569 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.571 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.612 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[d2dfbd08-6ec7-4960-aa89-ce8a127b9366]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.616 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf2516e-3057-4ed4-a448-8c340cfb6365]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 NetworkManager[55671]: <info>  [1764719670.6181] manager: (tapec494140-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/30)
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.660 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[c4be7ef3-05fb-49bb-8df7-b3b7bcbb305a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.664 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[d25c9fec-0c9e-4e4c-a144-91d899592e05]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 NetworkManager[55671]: <info>  [1764719670.7048] device (tapec494140-a0): carrier: link connected
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.711 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[00be8b5c-cafa-47cc-b999-04ee7215eafa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.733 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[297947e3-2783-480a-bd33-daa5fed1dedf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376134, 'reachable_time': 39106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211094, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.751 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[be6da856-c561-4b3f-8dc5-9dbc0271f0ac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:f8c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376134, 'tstamp': 376134}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211095, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.768 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9535d10b-d085-432d-830a-a72e3c27511f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376134, 'reachable_time': 39106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211096, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.803 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7a8256-c256-473a-8684-c82f5fc578ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.872 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[789d39e3-f46b-47c4-9712-98c27000ad6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.873 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.873 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.874 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.876 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 kernel: tapec494140-a0: entered promiscuous mode
Dec 02 23:54:30 compute-0 NetworkManager[55671]: <info>  [1764719670.8793] manager: (tapec494140-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.880 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.880 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 ovn_controller[95488]: 2025-12-02T23:54:30Z|00062|binding|INFO|Releasing lport 9ee451cb-cc6e-44d6-98fb-cdfa0566e521 from this chassis (sb_readonly=0)
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.883 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.885 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.886 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d4e7d1-7ab7-48c8-b856-aab9c1e45129]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.887 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.887 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.887 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ec494140-a5f4-4327-8807-d7248b1cdc9a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.887 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.888 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b5dd804a-e26a-43c4-b150-928340699acb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.888 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.888 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b5bfa9ce-caec-4b38-82da-5b73ec78a0e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.889 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: global
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: defaults
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     log global
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:54:30 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:54:30.889 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'env', 'PROCESS_TAG=haproxy-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec494140-a5f4-4327-8807-d7248b1cdc9a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:54:30 compute-0 nova_compute[187243]: 2025-12-02 23:54:30.898 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.152 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.230 187247 DEBUG nova.compute.manager [req-6777b086-707f-4630-a772-fe46854043a8 req-7ca5d923-da09-462e-8ef7-9fe853bd8438 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.231 187247 DEBUG oslo_concurrency.lockutils [req-6777b086-707f-4630-a772-fe46854043a8 req-7ca5d923-da09-462e-8ef7-9fe853bd8438 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.232 187247 DEBUG oslo_concurrency.lockutils [req-6777b086-707f-4630-a772-fe46854043a8 req-7ca5d923-da09-462e-8ef7-9fe853bd8438 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.232 187247 DEBUG oslo_concurrency.lockutils [req-6777b086-707f-4630-a772-fe46854043a8 req-7ca5d923-da09-462e-8ef7-9fe853bd8438 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.232 187247 DEBUG nova.compute.manager [req-6777b086-707f-4630-a772-fe46854043a8 req-7ca5d923-da09-462e-8ef7-9fe853bd8438 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Processing event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.268 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.116s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.269 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.327 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.338 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:54:31 compute-0 podman[211135]: 2025-12-02 23:54:31.340525166 +0000 UTC m=+0.052630948 container create 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.344 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.350 187247 INFO nova.virt.libvirt.driver [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance spawned successfully.
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.351 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:54:31 compute-0 systemd[1]: Started libpod-conmon-3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd.scope.
Dec 02 23:54:31 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:54:31 compute-0 podman[211135]: 2025-12-02 23:54:31.309791041 +0000 UTC m=+0.021896833 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:54:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1007859202f4275166c5b416d230884c1317792f1758f14b0c61b8214b0ba3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: ERROR   23:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: ERROR   23:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: ERROR   23:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: ERROR   23:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: ERROR   23:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:54:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:54:31 compute-0 podman[211135]: 2025-12-02 23:54:31.467302742 +0000 UTC m=+0.179408544 container init 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 02 23:54:31 compute-0 podman[211135]: 2025-12-02 23:54:31.474118837 +0000 UTC m=+0.186224609 container start 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 02 23:54:31 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [NOTICE]   (211158) : New worker (211160) forked
Dec 02 23:54:31 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [NOTICE]   (211158) : Loading success.
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.528 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.530 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.549 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.551 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5828MB free_disk=73.16587448120117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.551 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.552 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.862 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.863 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.863 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.864 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.864 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:31 compute-0 nova_compute[187243]: 2025-12-02 23:54:31.865 187247 DEBUG nova.virt.libvirt.driver [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.328 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:32 compute-0 sshd-session[210993]: Connection closed by authenticating user root 139.19.117.130 port 41110 [preauth]
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.378 187247 INFO nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Took 11.01 seconds to spawn the instance on the hypervisor.
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.379 187247 DEBUG nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.691 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance d8ccd45c-e570-4b75-b836-a93e2de1818b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.693 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.693 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:54:31 up  1:02,  0 user,  load average: 0.40, 0.36, 0.46\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.735 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:54:32 compute-0 nova_compute[187243]: 2025-12-02 23:54:32.976 187247 INFO nova.compute.manager [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Took 16.41 seconds to build instance.
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.246 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.326 187247 DEBUG nova.compute.manager [req-0409b807-60ca-4301-af0e-ebfea4b0679a req-b82d1770-e598-45a9-88a2-58f1cab39963 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.326 187247 DEBUG oslo_concurrency.lockutils [req-0409b807-60ca-4301-af0e-ebfea4b0679a req-b82d1770-e598-45a9-88a2-58f1cab39963 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.326 187247 DEBUG oslo_concurrency.lockutils [req-0409b807-60ca-4301-af0e-ebfea4b0679a req-b82d1770-e598-45a9-88a2-58f1cab39963 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.326 187247 DEBUG oslo_concurrency.lockutils [req-0409b807-60ca-4301-af0e-ebfea4b0679a req-b82d1770-e598-45a9-88a2-58f1cab39963 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.326 187247 DEBUG nova.compute.manager [req-0409b807-60ca-4301-af0e-ebfea4b0679a req-b82d1770-e598-45a9-88a2-58f1cab39963 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.327 187247 WARNING nova.compute.manager [req-0409b807-60ca-4301-af0e-ebfea4b0679a req-b82d1770-e598-45a9-88a2-58f1cab39963 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state None.
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.483 187247 DEBUG oslo_concurrency.lockutils [None req-33078200-e329-4e66-b1ec-d8b79621c52f d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.936s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.754 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:54:33 compute-0 nova_compute[187243]: 2025-12-02 23:54:33.754 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.203s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:54:35 compute-0 nova_compute[187243]: 2025-12-02 23:54:35.571 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:35 compute-0 nova_compute[187243]: 2025-12-02 23:54:35.754 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:35 compute-0 nova_compute[187243]: 2025-12-02 23:54:35.755 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:36 compute-0 sshd-session[211170]: Invalid user sales1 from 45.78.219.95 port 52682
Dec 02 23:54:36 compute-0 nova_compute[187243]: 2025-12-02 23:54:36.265 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:36 compute-0 nova_compute[187243]: 2025-12-02 23:54:36.265 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:54:37 compute-0 podman[211172]: 2025-12-02 23:54:37.138462052 +0000 UTC m=+0.078175768 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Dec 02 23:54:37 compute-0 nova_compute[187243]: 2025-12-02 23:54:37.332 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:37 compute-0 sshd-session[211170]: Received disconnect from 45.78.219.95 port 52682:11: Bye Bye [preauth]
Dec 02 23:54:37 compute-0 sshd-session[211170]: Disconnected from invalid user sales1 45.78.219.95 port 52682 [preauth]
Dec 02 23:54:40 compute-0 nova_compute[187243]: 2025-12-02 23:54:40.573 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:40 compute-0 sshd-session[211193]: Invalid user webuser from 102.210.148.92 port 48418
Dec 02 23:54:41 compute-0 sshd-session[211193]: Received disconnect from 102.210.148.92 port 48418:11: Bye Bye [preauth]
Dec 02 23:54:41 compute-0 sshd-session[211193]: Disconnected from invalid user webuser 102.210.148.92 port 48418 [preauth]
Dec 02 23:54:42 compute-0 podman[211195]: 2025-12-02 23:54:42.110249209 +0000 UTC m=+0.061581645 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:54:42 compute-0 nova_compute[187243]: 2025-12-02 23:54:42.334 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:43 compute-0 ovn_controller[95488]: 2025-12-02T23:54:43Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f8:84:51 10.100.0.10
Dec 02 23:54:43 compute-0 ovn_controller[95488]: 2025-12-02T23:54:43Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:84:51 10.100.0.10
Dec 02 23:54:45 compute-0 nova_compute[187243]: 2025-12-02 23:54:45.578 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:47 compute-0 nova_compute[187243]: 2025-12-02 23:54:47.336 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:50 compute-0 nova_compute[187243]: 2025-12-02 23:54:50.581 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-0 podman[211231]: 2025-12-02 23:54:52.142345932 +0000 UTC m=+0.089103153 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 02 23:54:52 compute-0 nova_compute[187243]: 2025-12-02 23:54:52.338 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:52 compute-0 sshd-session[211229]: Invalid user jenkins from 61.220.235.10 port 40810
Dec 02 23:54:53 compute-0 sshd-session[211229]: Received disconnect from 61.220.235.10 port 40810:11: Bye Bye [preauth]
Dec 02 23:54:53 compute-0 sshd-session[211229]: Disconnected from invalid user jenkins 61.220.235.10 port 40810 [preauth]
Dec 02 23:54:55 compute-0 nova_compute[187243]: 2025-12-02 23:54:55.584 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:55 compute-0 podman[211255]: 2025-12-02 23:54:55.717193031 +0000 UTC m=+0.084255376 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:54:55 compute-0 podman[211256]: 2025-12-02 23:54:55.753049521 +0000 UTC m=+0.125704342 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 02 23:54:57 compute-0 nova_compute[187243]: 2025-12-02 23:54:57.385 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:54:59 compute-0 podman[197600]: time="2025-12-02T23:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:54:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:54:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Dec 02 23:55:00 compute-0 nova_compute[187243]: 2025-12-02 23:55:00.586 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:00.677 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:00.677 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:00.678 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: ERROR   23:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: ERROR   23:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: ERROR   23:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: ERROR   23:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: ERROR   23:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:55:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:55:02 compute-0 nova_compute[187243]: 2025-12-02 23:55:02.429 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:05 compute-0 nova_compute[187243]: 2025-12-02 23:55:05.588 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:07 compute-0 nova_compute[187243]: 2025-12-02 23:55:07.432 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:08 compute-0 podman[211301]: 2025-12-02 23:55:08.147064979 +0000 UTC m=+0.092914083 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 02 23:55:10 compute-0 nova_compute[187243]: 2025-12-02 23:55:10.590 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:12 compute-0 nova_compute[187243]: 2025-12-02 23:55:12.471 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:13 compute-0 podman[211324]: 2025-12-02 23:55:13.123174435 +0000 UTC m=+0.068172303 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:55:14 compute-0 sshd-session[211302]: Invalid user blue from 45.78.222.160 port 49986
Dec 02 23:55:15 compute-0 sshd-session[211302]: Received disconnect from 45.78.222.160 port 49986:11: Bye Bye [preauth]
Dec 02 23:55:15 compute-0 sshd-session[211302]: Disconnected from invalid user blue 45.78.222.160 port 49986 [preauth]
Dec 02 23:55:15 compute-0 nova_compute[187243]: 2025-12-02 23:55:15.594 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:17 compute-0 nova_compute[187243]: 2025-12-02 23:55:17.431 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:17 compute-0 nova_compute[187243]: 2025-12-02 23:55:17.432 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:17 compute-0 nova_compute[187243]: 2025-12-02 23:55:17.432 187247 DEBUG nova.network.neutron [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:55:17 compute-0 nova_compute[187243]: 2025-12-02 23:55:17.520 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:17 compute-0 nova_compute[187243]: 2025-12-02 23:55:17.939 187247 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:18 compute-0 nova_compute[187243]: 2025-12-02 23:55:18.505 187247 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:18 compute-0 nova_compute[187243]: 2025-12-02 23:55:18.637 187247 DEBUG nova.network.neutron [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:19 compute-0 nova_compute[187243]: 2025-12-02 23:55:19.145 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:20 compute-0 nova_compute[187243]: 2025-12-02 23:55:20.597 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:20 compute-0 nova_compute[187243]: 2025-12-02 23:55:20.883 187247 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12417
Dec 02 23:55:20 compute-0 nova_compute[187243]: 2025-12-02 23:55:20.885 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Creating file /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/5720ab648bd44c688df90e86abfdc1b7.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Dec 02 23:55:20 compute-0 nova_compute[187243]: 2025-12-02 23:55:20.885 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/5720ab648bd44c688df90e86abfdc1b7.tmp execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:21 compute-0 nova_compute[187243]: 2025-12-02 23:55:21.407 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/5720ab648bd44c688df90e86abfdc1b7.tmp" returned: 1 in 0.522s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:21 compute-0 nova_compute[187243]: 2025-12-02 23:55:21.409 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/5720ab648bd44c688df90e86abfdc1b7.tmp' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 02 23:55:21 compute-0 nova_compute[187243]: 2025-12-02 23:55:21.410 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Creating directory /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b on remote host 192.168.122.101 create_dir /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Dec 02 23:55:21 compute-0 nova_compute[187243]: 2025-12-02 23:55:21.410 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:21 compute-0 nova_compute[187243]: 2025-12-02 23:55:21.647 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b" returned: 0 in 0.237s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:21 compute-0 nova_compute[187243]: 2025-12-02 23:55:21.654 187247 DEBUG nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4247
Dec 02 23:55:22 compute-0 nova_compute[187243]: 2025-12-02 23:55:22.523 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:22 compute-0 ovn_controller[95488]: 2025-12-02T23:55:22Z|00063|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 02 23:55:23 compute-0 podman[211353]: 2025-12-02 23:55:23.13826 +0000 UTC m=+0.086278732 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:55:23 compute-0 kernel: tapfbb4ca60-8a (unregistering): left promiscuous mode
Dec 02 23:55:23 compute-0 NetworkManager[55671]: <info>  [1764719723.8783] device (tapfbb4ca60-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:55:23 compute-0 ovn_controller[95488]: 2025-12-02T23:55:23Z|00064|binding|INFO|Releasing lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe from this chassis (sb_readonly=0)
Dec 02 23:55:23 compute-0 ovn_controller[95488]: 2025-12-02T23:55:23Z|00065|binding|INFO|Setting lport fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe down in Southbound
Dec 02 23:55:23 compute-0 ovn_controller[95488]: 2025-12-02T23:55:23Z|00066|binding|INFO|Removing iface tapfbb4ca60-8a ovn-installed in OVS
Dec 02 23:55:23 compute-0 nova_compute[187243]: 2025-12-02 23:55:23.889 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:23 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:23.895 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:84:51 10.100.0.10'], port_security=['fa:16:3e:f8:84:51 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8ccd45c-e570-4b75-b836-a93e2de1818b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:55:23 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:23.896 104379 INFO neutron.agent.ovn.metadata.agent [-] Port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:55:23 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:23.897 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:55:23 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:23.899 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7fb237-ff3b-4d3e-a341-1d6fdc3fc01c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:23 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:23.899 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a namespace which is not needed anymore
Dec 02 23:55:23 compute-0 nova_compute[187243]: 2025-12-02 23:55:23.912 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:23 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 02 23:55:23 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 14.925s CPU time.
Dec 02 23:55:23 compute-0 systemd-machined[153518]: Machine qemu-3-instance-00000004 terminated.
Dec 02 23:55:24 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [NOTICE]   (211158) : haproxy version is 3.0.5-8e879a5
Dec 02 23:55:24 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [NOTICE]   (211158) : path to executable is /usr/sbin/haproxy
Dec 02 23:55:24 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [WARNING]  (211158) : Exiting Master process...
Dec 02 23:55:24 compute-0 podman[211401]: 2025-12-02 23:55:24.010418175 +0000 UTC m=+0.030407848 container kill 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 02 23:55:24 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [ALERT]    (211158) : Current worker (211160) exited with code 143 (Terminated)
Dec 02 23:55:24 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211154]: [WARNING]  (211158) : All workers exited. Exiting... (0)
Dec 02 23:55:24 compute-0 systemd[1]: libpod-3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd.scope: Deactivated successfully.
Dec 02 23:55:24 compute-0 podman[211416]: 2025-12-02 23:55:24.055423005 +0000 UTC m=+0.025266153 container died 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 02 23:55:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd-userdata-shm.mount: Deactivated successfully.
Dec 02 23:55:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-f1007859202f4275166c5b416d230884c1317792f1758f14b0c61b8214b0ba3d-merged.mount: Deactivated successfully.
Dec 02 23:55:24 compute-0 podman[211416]: 2025-12-02 23:55:24.089658915 +0000 UTC m=+0.059502063 container cleanup 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 02 23:55:24 compute-0 systemd[1]: libpod-conmon-3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd.scope: Deactivated successfully.
Dec 02 23:55:24 compute-0 podman[211423]: 2025-12-02 23:55:24.116743591 +0000 UTC m=+0.058909378 container remove 3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.123 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[63428c44-bbe8-408c-9c7d-1fb27ce0fa9f]: (4, ("Tue Dec  2 11:55:23 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a (3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd)\n3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd\nTue Dec  2 11:55:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a (3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd)\n3917454e5ec2a17524395ea1cc8fdc46e27a7af0505a75978bdddd5bb96b3ebd\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.125 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4d8b238b-8f5b-4853-b222-fbf0faf5ad25]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.126 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.127 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f363cd0a-48d0-4ec0-8473-9ed4cb1b4d97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.128 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.129 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.151 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 kernel: tapec494140-a0: left promiscuous mode
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.157 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.158 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eb552f4c-9f0e-44eb-8ab2-c4fcd0b06278]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.177 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4681b674-4558-4420-8a24-c0e69e23b306]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.179 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b677bfe3-26bd-407f-bf3a-bab6069a334d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.195 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[24dcd23e-5fa9-4eeb-a5fb-a5257bfc03df]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376124, 'reachable_time': 16117, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211464, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.197 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.197 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[18f65079-88ca-4fec-9eca-7d80970c7bd5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:55:24 compute-0 systemd[1]: run-netns-ovnmeta\x2dec494140\x2da5f4\x2d4327\x2d8807\x2dd7248b1cdc9a.mount: Deactivated successfully.
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.220 187247 DEBUG nova.compute.manager [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.220 187247 DEBUG oslo_concurrency.lockutils [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.220 187247 DEBUG oslo_concurrency.lockutils [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.221 187247 DEBUG oslo_concurrency.lockutils [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.221 187247 DEBUG nova.compute.manager [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.221 187247 WARNING nova.compute.manager [req-30c9581f-2610-4d8c-ac31-9e128cd495a2 req-2c1f0727-26e5-49a9-8078-f1a02b42d422 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state resize_migrating.
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.287 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.288 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:55:24 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:24.288 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.673 187247 INFO nova.virt.libvirt.driver [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance shutdown successfully after 3 seconds.
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.678 187247 INFO nova.virt.libvirt.driver [-] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Instance destroyed successfully.
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.679 187247 DEBUG nova.virt.libvirt.vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:54:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:55:12Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.679 187247 DEBUG nova.network.os_vif_util [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:f8:84:51"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.680 187247 DEBUG nova.network.os_vif_util [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.681 187247 DEBUG os_vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.683 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.683 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbb4ca60-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.684 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.685 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.686 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.687 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=46908d5b-b2fd-4426-9859-ca4f295ab544) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.687 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.688 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.690 187247 INFO os_vif [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a')
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.694 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.747 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.748 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.816 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.818 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Copying file /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk to 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 02 23:55:24 compute-0 nova_compute[187243]: 2025-12-02 23:55:24.818 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.459 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "scp -r /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk" returned: 0 in 0.641s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.461 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Copying file /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.462 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk.config 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.598 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.705 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "scp -C -r /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk.config 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.config" returned: 0 in 0.243s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.705 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Copying file /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.706 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk.info 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.914 187247 DEBUG oslo_concurrency.processutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "scp -C -r /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b_resize/disk.info 192.168.122.101:/var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk.info" returned: 0 in 0.208s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.916 187247 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:25 compute-0 nova_compute[187243]: 2025-12-02 23:55:25.917 187247 WARNING neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.113 187247 DEBUG neutronclient.v2_0.client [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Dec 02 23:55:26 compute-0 podman[211481]: 2025-12-02 23:55:26.124466223 +0000 UTC m=+0.068971822 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 02 23:55:26 compute-0 podman[211482]: 2025-12-02 23:55:26.164436072 +0000 UTC m=+0.109375601 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 02 23:55:26 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:55:26.289 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:26 compute-0 sshd-session[211475]: Invalid user dev from 49.247.36.49 port 1767
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.347 187247 DEBUG nova.compute.manager [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.347 187247 DEBUG oslo_concurrency.lockutils [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.348 187247 DEBUG oslo_concurrency.lockutils [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.348 187247 DEBUG oslo_concurrency.lockutils [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.348 187247 DEBUG nova.compute.manager [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:26 compute-0 nova_compute[187243]: 2025-12-02 23:55:26.348 187247 WARNING nova.compute.manager [req-11745788-7c4c-41ec-9287-aac17a28a941 req-48a4f336-d486-4171-80ae-c9f53367f292 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-unplugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state resize_migrating.
Dec 02 23:55:26 compute-0 sshd-session[211475]: Received disconnect from 49.247.36.49 port 1767:11: Bye Bye [preauth]
Dec 02 23:55:26 compute-0 sshd-session[211475]: Disconnected from invalid user dev 49.247.36.49 port 1767 [preauth]
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.153 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.153 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.678 187247 INFO nova.compute.rpcapi [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.679 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.739 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.739 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:27 compute-0 nova_compute[187243]: 2025-12-02 23:55:27.739 187247 DEBUG oslo_concurrency.lockutils [None req-9aa5ca8f-5afa-4ddd-8765-58eee8a1aefe 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:29 compute-0 nova_compute[187243]: 2025-12-02 23:55:29.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:29 compute-0 nova_compute[187243]: 2025-12-02 23:55:29.720 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:29 compute-0 podman[197600]: time="2025-12-02T23:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:55:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:55:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.370 187247 DEBUG nova.compute.manager [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-changed-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.371 187247 DEBUG nova.compute.manager [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Refreshing instance network info cache due to event network-changed-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.371 187247 DEBUG oslo_concurrency.lockutils [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.372 187247 DEBUG oslo_concurrency.lockutils [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.372 187247 DEBUG nova.network.neutron [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Refreshing network info cache for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.600 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:30 compute-0 nova_compute[187243]: 2025-12-02 23:55:30.880 187247 WARNING neutronclient.v2_0.client [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: ERROR   23:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: ERROR   23:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: ERROR   23:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: ERROR   23:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: ERROR   23:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:55:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:55:31 compute-0 nova_compute[187243]: 2025-12-02 23:55:31.461 187247 WARNING neutronclient.v2_0.client [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:31 compute-0 nova_compute[187243]: 2025-12-02 23:55:31.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:31 compute-0 nova_compute[187243]: 2025-12-02 23:55:31.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:32 compute-0 nova_compute[187243]: 2025-12-02 23:55:32.592 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:32 compute-0 nova_compute[187243]: 2025-12-02 23:55:32.593 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:32 compute-0 nova_compute[187243]: 2025-12-02 23:55:32.593 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:32 compute-0 nova_compute[187243]: 2025-12-02 23:55:32.594 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.113 187247 DEBUG nova.network.neutron [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updated VIF entry in instance network info cache for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.114 187247 DEBUG nova.network.neutron [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.743 187247 DEBUG oslo_concurrency.lockutils [req-417684fb-8f62-422d-bb6d-d945bf89e6a6 req-d0ec7373-5fcf-41fb-8cc9-525e2162aed1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.751 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000004, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/d8ccd45c-e570-4b75-b836-a93e2de1818b/disk
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.896 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.897 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.918 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.919 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5807MB free_disk=73.13721466064453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.920 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:33 compute-0 nova_compute[187243]: 2025-12-02 23:55:33.920 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:34 compute-0 nova_compute[187243]: 2025-12-02 23:55:34.723 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:34 compute-0 nova_compute[187243]: 2025-12-02 23:55:34.944 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration for instance d8ccd45c-e570-4b75-b836-a93e2de1818b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.456 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating resource usage from migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.457 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Starting to track outgoing migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1549
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.498 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.499 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.500 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:55:33 up  1:03,  0 user,  load average: 0.47, 0.40, 0.46\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.549 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:55:35 compute-0 nova_compute[187243]: 2025-12-02 23:55:35.603 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.064 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.468 187247 DEBUG nova.compute.manager [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.469 187247 DEBUG oslo_concurrency.lockutils [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.469 187247 DEBUG oslo_concurrency.lockutils [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.469 187247 DEBUG oslo_concurrency.lockutils [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.470 187247 DEBUG nova.compute.manager [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.470 187247 WARNING nova.compute.manager [req-50c05863-6007-4921-b70e-92dc208c433a req-c36d786a-e922-4357-ba87-815dc0734c72 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state active and task_state resize_finish.
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.601 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:55:36 compute-0 nova_compute[187243]: 2025-12-02 23:55:36.602 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.682s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:37 compute-0 nova_compute[187243]: 2025-12-02 23:55:37.598 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:37 compute-0 nova_compute[187243]: 2025-12-02 23:55:37.599 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:55:38 compute-0 nova_compute[187243]: 2025-12-02 23:55:38.655 187247 DEBUG nova.compute.manager [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:38 compute-0 nova_compute[187243]: 2025-12-02 23:55:38.656 187247 DEBUG oslo_concurrency.lockutils [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:38 compute-0 nova_compute[187243]: 2025-12-02 23:55:38.656 187247 DEBUG oslo_concurrency.lockutils [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:38 compute-0 nova_compute[187243]: 2025-12-02 23:55:38.657 187247 DEBUG oslo_concurrency.lockutils [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:38 compute-0 nova_compute[187243]: 2025-12-02 23:55:38.657 187247 DEBUG nova.compute.manager [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] No waiting events found dispatching network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:55:38 compute-0 nova_compute[187243]: 2025-12-02 23:55:38.658 187247 WARNING nova.compute.manager [req-3102c78d-4338-4d12-81d1-d20c8bbac107 req-1ac0a738-5450-4b24-b01e-973811ff66ce 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Received unexpected event network-vif-plugged-fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for instance with vm_state resized and task_state None.
Dec 02 23:55:39 compute-0 podman[211526]: 2025-12-02 23:55:39.187993886 +0000 UTC m=+0.133003903 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Dec 02 23:55:39 compute-0 sshd-session[211525]: Invalid user tibero from 20.123.120.169 port 59756
Dec 02 23:55:39 compute-0 nova_compute[187243]: 2025-12-02 23:55:39.531 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:39 compute-0 nova_compute[187243]: 2025-12-02 23:55:39.532 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:39 compute-0 nova_compute[187243]: 2025-12-02 23:55:39.532 187247 DEBUG nova.compute.manager [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:5287
Dec 02 23:55:39 compute-0 sshd-session[211525]: Received disconnect from 20.123.120.169 port 59756:11: Bye Bye [preauth]
Dec 02 23:55:39 compute-0 sshd-session[211525]: Disconnected from invalid user tibero 20.123.120.169 port 59756 [preauth]
Dec 02 23:55:39 compute-0 nova_compute[187243]: 2025-12-02 23:55:39.725 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:40 compute-0 nova_compute[187243]: 2025-12-02 23:55:40.077 187247 DEBUG nova.objects.instance [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'info_cache' on Instance uuid d8ccd45c-e570-4b75-b836-a93e2de1818b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:55:40 compute-0 nova_compute[187243]: 2025-12-02 23:55:40.605 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:40 compute-0 nova_compute[187243]: 2025-12-02 23:55:40.668 187247 WARNING neutronclient.v2_0.client [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.136 187247 WARNING neutronclient.v2_0.client [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.136 187247 WARNING neutronclient.v2_0.client [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.303 187247 DEBUG neutronclient.v2_0.client [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.304 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.304 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.305 187247 DEBUG nova.network.neutron [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:55:41 compute-0 nova_compute[187243]: 2025-12-02 23:55:41.853 187247 WARNING neutronclient.v2_0.client [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:42 compute-0 nova_compute[187243]: 2025-12-02 23:55:42.596 187247 WARNING neutronclient.v2_0.client [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:42 compute-0 nova_compute[187243]: 2025-12-02 23:55:42.772 187247 DEBUG nova.network.neutron [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d8ccd45c-e570-4b75-b836-a93e2de1818b] Updating instance_info_cache with network_info: [{"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.300 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d8ccd45c-e570-4b75-b836-a93e2de1818b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.300 187247 DEBUG nova.objects.instance [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid d8ccd45c-e570-4b75-b836-a93e2de1818b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.431 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.431 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.817 187247 DEBUG nova.objects.base [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<d8ccd45c-e570-4b75-b836-a93e2de1818b> lazy-loaded attributes: info_cache,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.830 187247 DEBUG nova.virt.libvirt.vif [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:54:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-607610768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-607610768',id=4,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:55:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-ziravjgf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:55:37Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d8ccd45c-e570-4b75-b836-a93e2de1818b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.831 187247 DEBUG nova.network.os_vif_util [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "address": "fa:16:3e:f8:84:51", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbb4ca60-8a", "ovs_interfaceid": "fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.831 187247 DEBUG nova.network.os_vif_util [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.832 187247 DEBUG os_vif [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.834 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.834 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbb4ca60-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.834 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.836 187247 INFO os_vif [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:84:51,bridge_name='br-int',has_traffic_filtering=True,id=fbb4ca60-8a2d-4c3a-a71c-a81ac8833afe,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbb4ca60-8a')
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.837 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.837 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:43 compute-0 nova_compute[187243]: 2025-12-02 23:55:43.938 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:55:44 compute-0 podman[211549]: 2025-12-02 23:55:44.131730786 +0000 UTC m=+0.079243342 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible)
Dec 02 23:55:44 compute-0 nova_compute[187243]: 2025-12-02 23:55:44.435 187247 DEBUG nova.compute.provider_tree [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:55:44 compute-0 nova_compute[187243]: 2025-12-02 23:55:44.545 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:44 compute-0 nova_compute[187243]: 2025-12-02 23:55:44.727 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:44 compute-0 nova_compute[187243]: 2025-12-02 23:55:44.943 187247 DEBUG nova.scheduler.client.report [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:55:45 compute-0 nova_compute[187243]: 2025-12-02 23:55:45.607 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:46 compute-0 nova_compute[187243]: 2025-12-02 23:55:46.051 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.214s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:46 compute-0 nova_compute[187243]: 2025-12-02 23:55:46.054 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.509s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:46 compute-0 nova_compute[187243]: 2025-12-02 23:55:46.062 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:55:46 compute-0 nova_compute[187243]: 2025-12-02 23:55:46.063 187247 INFO nova.compute.claims [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Claim successful on node compute-0.ctlplane.example.com
Dec 02 23:55:46 compute-0 nova_compute[187243]: 2025-12-02 23:55:46.728 187247 INFO nova.scheduler.client.report [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 10d3b043-2ad6-4e69-839b-9c9c56bc0f9a
Dec 02 23:55:47 compute-0 nova_compute[187243]: 2025-12-02 23:55:47.194 187247 DEBUG nova.compute.provider_tree [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:55:47 compute-0 nova_compute[187243]: 2025-12-02 23:55:47.293 187247 DEBUG oslo_concurrency.lockutils [None req-c14a2d71-05a9-4a4c-be24-8feeada29975 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d8ccd45c-e570-4b75-b836-a93e2de1818b" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 7.762s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:48 compute-0 nova_compute[187243]: 2025-12-02 23:55:48.109 187247 DEBUG nova.scheduler.client.report [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:55:48 compute-0 nova_compute[187243]: 2025-12-02 23:55:48.657 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:48 compute-0 nova_compute[187243]: 2025-12-02 23:55:48.658 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:55:49 compute-0 nova_compute[187243]: 2025-12-02 23:55:49.749 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:50 compute-0 nova_compute[187243]: 2025-12-02 23:55:50.018 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:55:50 compute-0 nova_compute[187243]: 2025-12-02 23:55:50.019 187247 DEBUG nova.network.neutron [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:55:50 compute-0 nova_compute[187243]: 2025-12-02 23:55:50.019 187247 WARNING neutronclient.v2_0.client [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:50 compute-0 nova_compute[187243]: 2025-12-02 23:55:50.020 187247 WARNING neutronclient.v2_0.client [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:50 compute-0 nova_compute[187243]: 2025-12-02 23:55:50.587 187247 INFO nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:55:50 compute-0 nova_compute[187243]: 2025-12-02 23:55:50.610 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:51 compute-0 nova_compute[187243]: 2025-12-02 23:55:51.177 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.149 187247 DEBUG nova.network.neutron [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Successfully created port: aa1a4037-7471-48e2-8297-5aeb45672ebb _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.201 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.202 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.202 187247 INFO nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Creating image(s)
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.203 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.203 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.204 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.204 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.207 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.209 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.264 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.265 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.265 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.266 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.268 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.269 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.323 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.324 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.366 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.367 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.368 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.447 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.448 187247 DEBUG nova.virt.disk.api [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Checking if we can resize image /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.448 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.502 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.503 187247 DEBUG nova.virt.disk.api [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Cannot resize image /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.504 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.504 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Ensure instance console log exists: /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.504 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.505 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:52 compute-0 nova_compute[187243]: 2025-12-02 23:55:52.505 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:52 compute-0 sshd-session[211570]: Invalid user dangulo from 102.210.148.92 port 47598
Dec 02 23:55:52 compute-0 sshd-session[211570]: Received disconnect from 102.210.148.92 port 47598:11: Bye Bye [preauth]
Dec 02 23:55:52 compute-0 sshd-session[211570]: Disconnected from invalid user dangulo 102.210.148.92 port 47598 [preauth]
Dec 02 23:55:54 compute-0 podman[211587]: 2025-12-02 23:55:54.09694436 +0000 UTC m=+0.054109102 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.200 187247 DEBUG nova.network.neutron [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Successfully updated port: aa1a4037-7471-48e2-8297-5aeb45672ebb _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.272 187247 DEBUG nova.compute.manager [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-changed-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.273 187247 DEBUG nova.compute.manager [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Refreshing instance network info cache due to event network-changed-aa1a4037-7471-48e2-8297-5aeb45672ebb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.273 187247 DEBUG oslo_concurrency.lockutils [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.273 187247 DEBUG oslo_concurrency.lockutils [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.273 187247 DEBUG nova.network.neutron [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Refreshing network info cache for port aa1a4037-7471-48e2-8297-5aeb45672ebb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.707 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.751 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:54 compute-0 nova_compute[187243]: 2025-12-02 23:55:54.779 187247 WARNING neutronclient.v2_0.client [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:55 compute-0 nova_compute[187243]: 2025-12-02 23:55:55.140 187247 DEBUG nova.network.neutron [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:55:55 compute-0 nova_compute[187243]: 2025-12-02 23:55:55.373 187247 DEBUG nova.network.neutron [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:55 compute-0 nova_compute[187243]: 2025-12-02 23:55:55.612 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:55 compute-0 nova_compute[187243]: 2025-12-02 23:55:55.880 187247 DEBUG oslo_concurrency.lockutils [req-8f5b1f45-10e7-4d8c-93ef-c9775a8ef955 req-9edd83ae-c748-419e-8a82-86f76e12a951 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:55 compute-0 nova_compute[187243]: 2025-12-02 23:55:55.880 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquired lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:55:55 compute-0 nova_compute[187243]: 2025-12-02 23:55:55.881 187247 DEBUG nova.network.neutron [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:55:57 compute-0 podman[211611]: 2025-12-02 23:55:57.08336506 +0000 UTC m=+0.044740706 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 02 23:55:57 compute-0 podman[211612]: 2025-12-02 23:55:57.123363539 +0000 UTC m=+0.077043998 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 02 23:55:57 compute-0 nova_compute[187243]: 2025-12-02 23:55:57.127 187247 DEBUG nova.network.neutron [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:55:57 compute-0 nova_compute[187243]: 2025-12-02 23:55:57.361 187247 WARNING neutronclient.v2_0.client [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.053 187247 DEBUG nova.network.neutron [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.566 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Releasing lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.567 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance network_info: |[{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.570 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Start _get_guest_xml network_info=[{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.573 187247 WARNING nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.575 187247 DEBUG nova.virt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-116577734', uuid='d66a42a4-6bab-485d-a45f-0df43bf25d1b'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": 
"aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719758.5751393) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.582 187247 DEBUG nova.virt.libvirt.host [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.582 187247 DEBUG nova.virt.libvirt.host [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.587 187247 DEBUG nova.virt.libvirt.host [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.587 187247 DEBUG nova.virt.libvirt.host [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.588 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.588 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.589 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.589 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.589 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.589 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.590 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.590 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.590 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.590 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.590 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.590 187247 DEBUG nova.virt.hardware [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.593 187247 DEBUG nova.virt.libvirt.vif [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:55:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-116577734',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-116577734',id=6,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-dt7jcyvd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:55:51Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.594 187247 DEBUG nova.network.os_vif_util [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.594 187247 DEBUG nova.network.os_vif_util [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:55:58 compute-0 nova_compute[187243]: 2025-12-02 23:55:58.595 187247 DEBUG nova.objects.instance [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'pci_devices' on Instance uuid d66a42a4-6bab-485d-a45f-0df43bf25d1b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.147 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <uuid>d66a42a4-6bab-485d-a45f-0df43bf25d1b</uuid>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <name>instance-00000006</name>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-116577734</nova:name>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:55:58</nova:creationTime>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:55:59 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:55:59 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         <nova:port uuid="aa1a4037-7471-48e2-8297-5aeb45672ebb">
Dec 02 23:55:59 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <system>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <entry name="serial">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <entry name="uuid">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </system>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <os>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </os>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <features>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </features>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:fd:d7:48"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <target dev="tapaa1a4037-74"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <video>
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </video>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:55:59 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:55:59 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:55:59 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:55:59 compute-0 nova_compute[187243]: </domain>
Dec 02 23:55:59 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.148 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Preparing to wait for external event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.148 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.148 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.148 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.149 187247 DEBUG nova.virt.libvirt.vif [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:55:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-116577734',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-116577734',id=6,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-dt7jcyvd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecu
teActionsViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:55:51Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.149 187247 DEBUG nova.network.os_vif_util [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.150 187247 DEBUG nova.network.os_vif_util [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.150 187247 DEBUG os_vif [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.150 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.151 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.151 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.151 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.151 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f72fb5b5-47db-50dd-881f-4c2fc19fab24', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.152 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.153 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.155 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.155 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa1a4037-74, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.156 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapaa1a4037-74, col_values=(('qos', UUID('ffdd2017-1a07-4d10-ada3-3b9ccda9c5a1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.156 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapaa1a4037-74, col_values=(('external_ids', {'iface-id': 'aa1a4037-7471-48e2-8297-5aeb45672ebb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:d7:48', 'vm-uuid': 'd66a42a4-6bab-485d-a45f-0df43bf25d1b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.157 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 NetworkManager[55671]: <info>  [1764719759.1580] manager: (tapaa1a4037-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.159 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.162 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:55:59 compute-0 nova_compute[187243]: 2025-12-02 23:55:59.163 187247 INFO os_vif [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74')
Dec 02 23:55:59 compute-0 podman[197600]: time="2025-12-02T23:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:55:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:55:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 02 23:56:00 compute-0 nova_compute[187243]: 2025-12-02 23:56:00.613 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:00.679 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:00.679 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:00.679 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:00 compute-0 nova_compute[187243]: 2025-12-02 23:56:00.751 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:56:00 compute-0 nova_compute[187243]: 2025-12-02 23:56:00.751 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:56:00 compute-0 nova_compute[187243]: 2025-12-02 23:56:00.752 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No VIF found with MAC fa:16:3e:fd:d7:48, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:56:00 compute-0 nova_compute[187243]: 2025-12-02 23:56:00.752 187247 INFO nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Using config drive
Dec 02 23:56:01 compute-0 nova_compute[187243]: 2025-12-02 23:56:01.285 187247 WARNING neutronclient.v2_0.client [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: ERROR   23:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: ERROR   23:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: ERROR   23:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: ERROR   23:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: ERROR   23:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:56:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:56:01 compute-0 sshd-session[211656]: Invalid user admin from 80.94.95.116 port 25564
Dec 02 23:56:01 compute-0 sshd-session[211656]: Connection closed by invalid user admin 80.94.95.116 port 25564 [preauth]
Dec 02 23:56:01 compute-0 nova_compute[187243]: 2025-12-02 23:56:01.997 187247 INFO nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Creating config drive at /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.002 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbmcbq73i execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.140 187247 DEBUG oslo_concurrency.processutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbmcbq73i" returned: 0 in 0.138s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:02 compute-0 kernel: tapaa1a4037-74: entered promiscuous mode
Dec 02 23:56:02 compute-0 NetworkManager[55671]: <info>  [1764719762.2247] manager: (tapaa1a4037-74): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Dec 02 23:56:02 compute-0 ovn_controller[95488]: 2025-12-02T23:56:02Z|00067|binding|INFO|Claiming lport aa1a4037-7471-48e2-8297-5aeb45672ebb for this chassis.
Dec 02 23:56:02 compute-0 ovn_controller[95488]: 2025-12-02T23:56:02Z|00068|binding|INFO|aa1a4037-7471-48e2-8297-5aeb45672ebb: Claiming fa:16:3e:fd:d7:48 10.100.0.12
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.225 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.240 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:d7:48 10.100.0.12'], port_security=['fa:16:3e:fd:d7:48 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd66a42a4-6bab-485d-a45f-0df43bf25d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=aa1a4037-7471-48e2-8297-5aeb45672ebb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.241 104379 INFO neutron.agent.ovn.metadata.agent [-] Port aa1a4037-7471-48e2-8297-5aeb45672ebb in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.243 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.242 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 ovn_controller[95488]: 2025-12-02T23:56:02Z|00069|binding|INFO|Setting lport aa1a4037-7471-48e2-8297-5aeb45672ebb ovn-installed in OVS
Dec 02 23:56:02 compute-0 ovn_controller[95488]: 2025-12-02T23:56:02Z|00070|binding|INFO|Setting lport aa1a4037-7471-48e2-8297-5aeb45672ebb up in Southbound
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.246 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.258 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ac354654-d69d-4397-816d-263941c979b4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.258 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec494140-a1 in ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.260 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec494140-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.260 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aa78b8de-6dab-4954-a25a-4c786f46c6cf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.260 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[98dc2bac-b4d4-44ab-8e27-be4f55a1aa18]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 systemd-machined[153518]: New machine qemu-4-instance-00000006.
Dec 02 23:56:02 compute-0 systemd-udevd[211679]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.274 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[8d363484-c675-451a-b4d5-633645b7abfc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 NetworkManager[55671]: <info>  [1764719762.2842] device (tapaa1a4037-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:56:02 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Dec 02 23:56:02 compute-0 NetworkManager[55671]: <info>  [1764719762.2860] device (tapaa1a4037-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.290 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d6889047-fa22-49da-8562-a665b58da894]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.319 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[23eb08bb-dd31-43be-bb10-7f44f4bc37a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 systemd-udevd[211682]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.323 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[42f4ba6e-7d2b-4a15-ba6b-deccaef5822a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 NetworkManager[55671]: <info>  [1764719762.3252] manager: (tapec494140-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.354 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2b610e-986d-4af6-87d9-4524d8702390]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.356 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5fa5c6-0b3e-451b-a789-b20a232c7cd0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 NetworkManager[55671]: <info>  [1764719762.3764] device (tapec494140-a0): carrier: link connected
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.382 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[45d7d3e7-25f1-4c65-bf97-fec9c5589c8c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.397 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0725e384-3229-473d-bc5b-47a4fdbf717f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385302, 'reachable_time': 20390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211710, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.410 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d94cb7-626e-4e2a-bd7d-d18a297ff832]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:f8c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385302, 'tstamp': 385302}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211711, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.426 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e388b01f-5d38-4afd-bb88-80a946c523c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385302, 'reachable_time': 20390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211712, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.427 187247 DEBUG nova.compute.manager [req-30973249-a309-4ef1-b420-dc86200d1dc5 req-83e03909-837b-47a5-9a69-40b4746d897b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.428 187247 DEBUG oslo_concurrency.lockutils [req-30973249-a309-4ef1-b420-dc86200d1dc5 req-83e03909-837b-47a5-9a69-40b4746d897b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.428 187247 DEBUG oslo_concurrency.lockutils [req-30973249-a309-4ef1-b420-dc86200d1dc5 req-83e03909-837b-47a5-9a69-40b4746d897b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.428 187247 DEBUG oslo_concurrency.lockutils [req-30973249-a309-4ef1-b420-dc86200d1dc5 req-83e03909-837b-47a5-9a69-40b4746d897b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.429 187247 DEBUG nova.compute.manager [req-30973249-a309-4ef1-b420-dc86200d1dc5 req-83e03909-837b-47a5-9a69-40b4746d897b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Processing event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.453 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[34841d22-54fd-4a87-8374-6f7e4154f89b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.522 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5a2b91-179a-4002-801f-a1fdf53d29a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.524 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.524 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.524 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.526 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 NetworkManager[55671]: <info>  [1764719762.5271] manager: (tapec494140-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec 02 23:56:02 compute-0 kernel: tapec494140-a0: entered promiscuous mode
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.530 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.531 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.532 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 ovn_controller[95488]: 2025-12-02T23:56:02Z|00071|binding|INFO|Releasing lport 9ee451cb-cc6e-44d6-98fb-cdfa0566e521 from this chassis (sb_readonly=0)
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.547 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 nova_compute[187243]: 2025-12-02 23:56:02.549 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.550 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad5e910-6159-4fb3-a667-0c9152d033a6]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.550 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.551 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.551 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ec494140-a5f4-4327-8807-d7248b1cdc9a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.551 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.551 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[20e945eb-0ef5-401f-91b1-abf4926db4cf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.551 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.552 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[baba2fe5-fa91-447a-bd8e-2def5bd08d81]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.552 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: global
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: defaults
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     log global
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:56:02 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:02.552 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'env', 'PROCESS_TAG=haproxy-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec494140-a5f4-4327-8807-d7248b1cdc9a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:56:02 compute-0 podman[211744]: 2025-12-02 23:56:02.949963552 +0000 UTC m=+0.050580976 container create cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 02 23:56:02 compute-0 systemd[1]: Started libpod-conmon-cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131.scope.
Dec 02 23:56:03 compute-0 podman[211744]: 2025-12-02 23:56:02.921538513 +0000 UTC m=+0.022155957 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:56:03 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:56:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96cf13fd13da20b3e5390d38a40da03343604dce6347c3b582255b3ea079878c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:56:03 compute-0 podman[211744]: 2025-12-02 23:56:03.04231626 +0000 UTC m=+0.142933684 container init cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 23:56:03 compute-0 podman[211744]: 2025-12-02 23:56:03.049365991 +0000 UTC m=+0.149983415 container start cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 02 23:56:03 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [NOTICE]   (211770) : New worker (211772) forked
Dec 02 23:56:03 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [NOTICE]   (211770) : Loading success.
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.118 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.123 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.126 187247 INFO nova.virt.libvirt.driver [-] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance spawned successfully.
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.127 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.641 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.642 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.643 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.644 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.644 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:03 compute-0 nova_compute[187243]: 2025-12-02 23:56:03.645 187247 DEBUG nova.virt.libvirt.driver [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.156 187247 INFO nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Took 11.95 seconds to spawn the instance on the hypervisor.
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.158 187247 DEBUG nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.159 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.542 187247 DEBUG nova.compute.manager [req-b9c64f6d-1dd2-4e9e-a4fa-6b8aa80111bc req-beaa7875-aee4-4cfc-bbf0-c8d32ba71b41 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.542 187247 DEBUG oslo_concurrency.lockutils [req-b9c64f6d-1dd2-4e9e-a4fa-6b8aa80111bc req-beaa7875-aee4-4cfc-bbf0-c8d32ba71b41 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.543 187247 DEBUG oslo_concurrency.lockutils [req-b9c64f6d-1dd2-4e9e-a4fa-6b8aa80111bc req-beaa7875-aee4-4cfc-bbf0-c8d32ba71b41 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.544 187247 DEBUG oslo_concurrency.lockutils [req-b9c64f6d-1dd2-4e9e-a4fa-6b8aa80111bc req-beaa7875-aee4-4cfc-bbf0-c8d32ba71b41 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.544 187247 DEBUG nova.compute.manager [req-b9c64f6d-1dd2-4e9e-a4fa-6b8aa80111bc req-beaa7875-aee4-4cfc-bbf0-c8d32ba71b41 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.545 187247 WARNING nova.compute.manager [req-b9c64f6d-1dd2-4e9e-a4fa-6b8aa80111bc req-beaa7875-aee4-4cfc-bbf0-c8d32ba71b41 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received unexpected event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with vm_state active and task_state None.
Dec 02 23:56:04 compute-0 nova_compute[187243]: 2025-12-02 23:56:04.691 187247 INFO nova.compute.manager [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Took 20.21 seconds to build instance.
Dec 02 23:56:05 compute-0 nova_compute[187243]: 2025-12-02 23:56:05.196 187247 DEBUG oslo_concurrency.lockutils [None req-9ec175a9-db6f-4c78-bdc7-a15cde845a9b d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.765s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:05 compute-0 nova_compute[187243]: 2025-12-02 23:56:05.616 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:05 compute-0 sshd-session[211782]: Invalid user userb from 45.78.219.213 port 57050
Dec 02 23:56:06 compute-0 sshd-session[211782]: Received disconnect from 45.78.219.213 port 57050:11: Bye Bye [preauth]
Dec 02 23:56:06 compute-0 sshd-session[211782]: Disconnected from invalid user userb 45.78.219.213 port 57050 [preauth]
Dec 02 23:56:09 compute-0 nova_compute[187243]: 2025-12-02 23:56:09.184 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:10 compute-0 podman[211784]: 2025-12-02 23:56:10.1405869 +0000 UTC m=+0.090632527 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1755695350, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Dec 02 23:56:10 compute-0 nova_compute[187243]: 2025-12-02 23:56:10.632 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:14 compute-0 nova_compute[187243]: 2025-12-02 23:56:14.226 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:15 compute-0 podman[211815]: 2025-12-02 23:56:15.157887741 +0000 UTC m=+0.101712676 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Dec 02 23:56:15 compute-0 nova_compute[187243]: 2025-12-02 23:56:15.635 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:16 compute-0 ovn_controller[95488]: 2025-12-02T23:56:16Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:d7:48 10.100.0.12
Dec 02 23:56:16 compute-0 ovn_controller[95488]: 2025-12-02T23:56:16Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:d7:48 10.100.0.12
Dec 02 23:56:19 compute-0 nova_compute[187243]: 2025-12-02 23:56:19.271 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:20 compute-0 sshd-session[211835]: Received disconnect from 61.220.235.10 port 39988:11: Bye Bye [preauth]
Dec 02 23:56:20 compute-0 sshd-session[211835]: Disconnected from authenticating user root 61.220.235.10 port 39988 [preauth]
Dec 02 23:56:20 compute-0 nova_compute[187243]: 2025-12-02 23:56:20.637 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:24 compute-0 nova_compute[187243]: 2025-12-02 23:56:24.276 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:24 compute-0 podman[211843]: 2025-12-02 23:56:24.906081398 +0000 UTC m=+0.097243788 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 02 23:56:25 compute-0 nova_compute[187243]: 2025-12-02 23:56:25.640 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:27 compute-0 nova_compute[187243]: 2025-12-02 23:56:27.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:27 compute-0 nova_compute[187243]: 2025-12-02 23:56:27.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:56:28 compute-0 podman[211867]: 2025-12-02 23:56:28.130249527 +0000 UTC m=+0.075938921 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:56:28 compute-0 podman[211868]: 2025-12-02 23:56:28.17203712 +0000 UTC m=+0.120336607 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:56:28 compute-0 nova_compute[187243]: 2025-12-02 23:56:28.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:29 compute-0 nova_compute[187243]: 2025-12-02 23:56:29.280 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:29 compute-0 nova_compute[187243]: 2025-12-02 23:56:29.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:29 compute-0 podman[197600]: time="2025-12-02T23:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:56:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:56:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Dec 02 23:56:30 compute-0 nova_compute[187243]: 2025-12-02 23:56:30.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:30 compute-0 nova_compute[187243]: 2025-12-02 23:56:30.642 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:30 compute-0 sshd-session[211839]: Connection closed by 101.47.140.127 port 39828 [preauth]
Dec 02 23:56:30 compute-0 sshd-session[211841]: Connection closed by 45.78.218.154 port 56716 [preauth]
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: ERROR   23:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: ERROR   23:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: ERROR   23:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: ERROR   23:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: ERROR   23:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:56:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:56:32 compute-0 nova_compute[187243]: 2025-12-02 23:56:32.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:33 compute-0 nova_compute[187243]: 2025-12-02 23:56:33.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:33 compute-0 nova_compute[187243]: 2025-12-02 23:56:33.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:33 compute-0 nova_compute[187243]: 2025-12-02 23:56:33.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:34 compute-0 nova_compute[187243]: 2025-12-02 23:56:34.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:34 compute-0 nova_compute[187243]: 2025-12-02 23:56:34.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:34 compute-0 nova_compute[187243]: 2025-12-02 23:56:34.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:34 compute-0 nova_compute[187243]: 2025-12-02 23:56:34.106 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:56:34 compute-0 nova_compute[187243]: 2025-12-02 23:56:34.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.152 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.204 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.205 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.297 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.450 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.451 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.484 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.485 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=73.13732147216797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.485 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.486 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:35 compute-0 nova_compute[187243]: 2025-12-02 23:56:35.645 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.546 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance d66a42a4-6bab-485d-a45f-0df43bf25d1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.546 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.546 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:56:35 up  1:04,  0 user,  load average: 0.48, 0.41, 0.46\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.562 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.581 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.582 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.601 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.624 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 02 23:56:36 compute-0 nova_compute[187243]: 2025-12-02 23:56:36.675 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:56:37 compute-0 nova_compute[187243]: 2025-12-02 23:56:37.190 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:56:37 compute-0 nova_compute[187243]: 2025-12-02 23:56:37.700 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:56:37 compute-0 nova_compute[187243]: 2025-12-02 23:56:37.701 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.215s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:38 compute-0 nova_compute[187243]: 2025-12-02 23:56:38.117 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:38 compute-0 nova_compute[187243]: 2025-12-02 23:56:38.117 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:38 compute-0 nova_compute[187243]: 2025-12-02 23:56:38.625 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:56:39 compute-0 nova_compute[187243]: 2025-12-02 23:56:39.174 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:39 compute-0 nova_compute[187243]: 2025-12-02 23:56:39.174 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:39 compute-0 nova_compute[187243]: 2025-12-02 23:56:39.180 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:56:39 compute-0 nova_compute[187243]: 2025-12-02 23:56:39.180 187247 INFO nova.compute.claims [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Claim successful on node compute-0.ctlplane.example.com
Dec 02 23:56:39 compute-0 nova_compute[187243]: 2025-12-02 23:56:39.285 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:40 compute-0 nova_compute[187243]: 2025-12-02 23:56:40.263 187247 DEBUG nova.compute.provider_tree [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:56:40 compute-0 nova_compute[187243]: 2025-12-02 23:56:40.647 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:40 compute-0 nova_compute[187243]: 2025-12-02 23:56:40.776 187247 DEBUG nova.scheduler.client.report [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:56:41 compute-0 podman[211921]: 2025-12-02 23:56:41.103721401 +0000 UTC m=+0.063270944 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 02 23:56:41 compute-0 nova_compute[187243]: 2025-12-02 23:56:41.287 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:41 compute-0 nova_compute[187243]: 2025-12-02 23:56:41.288 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:56:41 compute-0 nova_compute[187243]: 2025-12-02 23:56:41.797 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:56:41 compute-0 nova_compute[187243]: 2025-12-02 23:56:41.797 187247 DEBUG nova.network.neutron [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:56:41 compute-0 nova_compute[187243]: 2025-12-02 23:56:41.798 187247 WARNING neutronclient.v2_0.client [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:41 compute-0 nova_compute[187243]: 2025-12-02 23:56:41.798 187247 WARNING neutronclient.v2_0.client [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:42 compute-0 nova_compute[187243]: 2025-12-02 23:56:42.304 187247 INFO nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:56:42 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:42.652 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:56:42 compute-0 nova_compute[187243]: 2025-12-02 23:56:42.652 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:42 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:42.654 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:56:42 compute-0 nova_compute[187243]: 2025-12-02 23:56:42.874 187247 DEBUG nova.network.neutron [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Successfully created port: 933e46ed-57a7-472a-adf9-eff09ae7c559 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:56:43 compute-0 nova_compute[187243]: 2025-12-02 23:56:43.063 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:56:43 compute-0 nova_compute[187243]: 2025-12-02 23:56:43.697 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.288 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.293 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.294 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.295 187247 INFO nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Creating image(s)
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.295 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.295 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.296 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.297 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.301 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.306 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.390 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.391 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.392 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.392 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.395 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.396 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.458 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.459 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.489 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.490 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.490 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.539 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.540 187247 DEBUG nova.virt.disk.api [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Checking if we can resize image /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.540 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.588 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.588 187247 DEBUG nova.virt.disk.api [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Cannot resize image /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.589 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.589 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Ensure instance console log exists: /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.589 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.590 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.590 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.830 187247 DEBUG nova.network.neutron [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Successfully updated port: 933e46ed-57a7-472a-adf9-eff09ae7c559 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.897 187247 DEBUG nova.compute.manager [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-changed-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.897 187247 DEBUG nova.compute.manager [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Refreshing instance network info cache due to event network-changed-933e46ed-57a7-472a-adf9-eff09ae7c559. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.897 187247 DEBUG oslo_concurrency.lockutils [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.897 187247 DEBUG oslo_concurrency.lockutils [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:56:44 compute-0 nova_compute[187243]: 2025-12-02 23:56:44.897 187247 DEBUG nova.network.neutron [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Refreshing network info cache for port 933e46ed-57a7-472a-adf9-eff09ae7c559 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:56:45 compute-0 nova_compute[187243]: 2025-12-02 23:56:45.336 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:56:45 compute-0 nova_compute[187243]: 2025-12-02 23:56:45.403 187247 WARNING neutronclient.v2_0.client [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:45 compute-0 nova_compute[187243]: 2025-12-02 23:56:45.649 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:46 compute-0 podman[211958]: 2025-12-02 23:56:46.102720961 +0000 UTC m=+0.062527697 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 02 23:56:46 compute-0 nova_compute[187243]: 2025-12-02 23:56:46.156 187247 DEBUG nova.network.neutron [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:56:46 compute-0 nova_compute[187243]: 2025-12-02 23:56:46.374 187247 DEBUG nova.network.neutron [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:56:46 compute-0 nova_compute[187243]: 2025-12-02 23:56:46.882 187247 DEBUG oslo_concurrency.lockutils [req-7371cfd8-094c-497b-99ab-4f733aa62203 req-9edfcd0d-9f87-4536-8063-e7c27b48985d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:56:46 compute-0 nova_compute[187243]: 2025-12-02 23:56:46.882 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:56:46 compute-0 nova_compute[187243]: 2025-12-02 23:56:46.883 187247 DEBUG nova.network.neutron [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:56:47 compute-0 nova_compute[187243]: 2025-12-02 23:56:47.480 187247 DEBUG nova.network.neutron [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.024 187247 WARNING neutronclient.v2_0.client [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.233 187247 DEBUG nova.network.neutron [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.741 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.742 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance network_info: |[{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.746 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Start _get_guest_xml network_info=[{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.757 187247 WARNING nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.759 187247 DEBUG nova.virt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-508367334', uuid='35a3db0d-2b6a-47be-bc85-4b164026935c'), owner=OwnerMeta(userid='d31b8a74cb3c48f3b147970eec936bca', username='tempest-TestExecuteActionsViaActuator-1889160444-project-admin', projectid='5f2368878ee9447ea8fcef9927711e2d', projectname='tempest-TestExecuteActionsViaActuator-1889160444'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": 
"933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719808.7592773) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.775 187247 DEBUG nova.virt.libvirt.host [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.776 187247 DEBUG nova.virt.libvirt.host [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.788 187247 DEBUG nova.virt.libvirt.host [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.789 187247 DEBUG nova.virt.libvirt.host [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.790 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.791 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.792 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.792 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.793 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.793 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.794 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.794 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.795 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.795 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.795 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.796 187247 DEBUG nova.virt.hardware [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.802 187247 DEBUG nova.virt.libvirt.vif [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:56:43Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.803 187247 DEBUG nova.network.os_vif_util [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.804 187247 DEBUG nova.network.os_vif_util [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:56:48 compute-0 nova_compute[187243]: 2025-12-02 23:56:48.806 187247 DEBUG nova.objects.instance [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lazy-loading 'pci_devices' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.291 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.314 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <uuid>35a3db0d-2b6a-47be-bc85-4b164026935c</uuid>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <name>instance-00000008</name>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-508367334</nova:name>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:56:48</nova:creationTime>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:56:49 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:56:49 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         <nova:port uuid="933e46ed-57a7-472a-adf9-eff09ae7c559">
Dec 02 23:56:49 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <system>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <entry name="serial">35a3db0d-2b6a-47be-bc85-4b164026935c</entry>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <entry name="uuid">35a3db0d-2b6a-47be-bc85-4b164026935c</entry>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </system>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <os>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </os>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <features>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </features>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:2d:bc:aa"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <target dev="tap933e46ed-57"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/console.log" append="off"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <video>
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </video>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:56:49 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:56:49 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:56:49 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:56:49 compute-0 nova_compute[187243]: </domain>
Dec 02 23:56:49 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.316 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Preparing to wait for external event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.316 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.317 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.317 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.319 187247 DEBUG nova.virt.libvirt.vif [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:56:43Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.320 187247 DEBUG nova.network.os_vif_util [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.321 187247 DEBUG nova.network.os_vif_util [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.321 187247 DEBUG os_vif [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.323 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.323 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.324 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.326 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c58cc7f7-bcdb-594b-9781-ef98323345a6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.328 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.329 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.335 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.335 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap933e46ed-57, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.336 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap933e46ed-57, col_values=(('qos', UUID('bd6e2a54-7c5f-4436-965b-ad5a9707e345')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.337 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap933e46ed-57, col_values=(('external_ids', {'iface-id': '933e46ed-57a7-472a-adf9-eff09ae7c559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:bc:aa', 'vm-uuid': '35a3db0d-2b6a-47be-bc85-4b164026935c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.338 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 NetworkManager[55671]: <info>  [1764719809.3399] manager: (tap933e46ed-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.342 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.349 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:49 compute-0 nova_compute[187243]: 2025-12-02 23:56:49.350 187247 INFO os_vif [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57')
Dec 02 23:56:50 compute-0 nova_compute[187243]: 2025-12-02 23:56:50.652 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:50 compute-0 nova_compute[187243]: 2025-12-02 23:56:50.902 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:56:50 compute-0 nova_compute[187243]: 2025-12-02 23:56:50.902 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:56:50 compute-0 nova_compute[187243]: 2025-12-02 23:56:50.902 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] No VIF found with MAC fa:16:3e:2d:bc:aa, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:56:50 compute-0 nova_compute[187243]: 2025-12-02 23:56:50.903 187247 INFO nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Using config drive
Dec 02 23:56:51 compute-0 nova_compute[187243]: 2025-12-02 23:56:51.412 187247 WARNING neutronclient.v2_0.client [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:56:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:51.655 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:51 compute-0 nova_compute[187243]: 2025-12-02 23:56:51.777 187247 INFO nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Creating config drive at /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config
Dec 02 23:56:51 compute-0 nova_compute[187243]: 2025-12-02 23:56:51.789 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpe88eaoq1 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:56:51 compute-0 nova_compute[187243]: 2025-12-02 23:56:51.929 187247 DEBUG oslo_concurrency.processutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpe88eaoq1" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:56:51 compute-0 kernel: tap933e46ed-57: entered promiscuous mode
Dec 02 23:56:51 compute-0 NetworkManager[55671]: <info>  [1764719811.9845] manager: (tap933e46ed-57): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Dec 02 23:56:51 compute-0 ovn_controller[95488]: 2025-12-02T23:56:51Z|00072|binding|INFO|Claiming lport 933e46ed-57a7-472a-adf9-eff09ae7c559 for this chassis.
Dec 02 23:56:51 compute-0 ovn_controller[95488]: 2025-12-02T23:56:51Z|00073|binding|INFO|933e46ed-57a7-472a-adf9-eff09ae7c559: Claiming fa:16:3e:2d:bc:aa 10.100.0.9
Dec 02 23:56:51 compute-0 nova_compute[187243]: 2025-12-02 23:56:51.986 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:51.997 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:bc:aa 10.100.0.9'], port_security=['fa:16:3e:2d:bc:aa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '35a3db0d-2b6a-47be-bc85-4b164026935c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=933e46ed-57a7-472a-adf9-eff09ae7c559) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:56:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:51.998 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 933e46ed-57a7-472a-adf9-eff09ae7c559 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a bound to our chassis
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.000 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.000 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:52 compute-0 ovn_controller[95488]: 2025-12-02T23:56:52Z|00074|binding|INFO|Setting lport 933e46ed-57a7-472a-adf9-eff09ae7c559 ovn-installed in OVS
Dec 02 23:56:52 compute-0 ovn_controller[95488]: 2025-12-02T23:56:52Z|00075|binding|INFO|Setting lport 933e46ed-57a7-472a-adf9-eff09ae7c559 up in Southbound
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.007 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:52 compute-0 systemd-machined[153518]: New machine qemu-5-instance-00000008.
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.022 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[579f0abc-7a29-4092-90fe-601b7eb633ba]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Dec 02 23:56:52 compute-0 systemd-udevd[212001]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.063 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[a51d516b-f893-4240-ad4d-07ee300d0275]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 NetworkManager[55671]: <info>  [1764719812.0669] device (tap933e46ed-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:56:52 compute-0 NetworkManager[55671]: <info>  [1764719812.0678] device (tap933e46ed-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.067 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc0b016-df76-4718-9de9-8c8c29cf7244]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.105 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[dc15fe70-50a2-42c9-828d-207b8f4ce21c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.130 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e0121c8b-f63c-48b3-95a8-4571c95b0790]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385302, 'reachable_time': 32877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212011, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.145 187247 DEBUG nova.compute.manager [req-5f93c474-fd84-479f-a5c1-7d9786bc2ec5 req-4a17b1d0-becb-4216-b925-c211cf928d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.145 187247 DEBUG oslo_concurrency.lockutils [req-5f93c474-fd84-479f-a5c1-7d9786bc2ec5 req-4a17b1d0-becb-4216-b925-c211cf928d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.145 187247 DEBUG oslo_concurrency.lockutils [req-5f93c474-fd84-479f-a5c1-7d9786bc2ec5 req-4a17b1d0-becb-4216-b925-c211cf928d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.145 187247 DEBUG oslo_concurrency.lockutils [req-5f93c474-fd84-479f-a5c1-7d9786bc2ec5 req-4a17b1d0-becb-4216-b925-c211cf928d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.145 187247 DEBUG nova.compute.manager [req-5f93c474-fd84-479f-a5c1-7d9786bc2ec5 req-4a17b1d0-becb-4216-b925-c211cf928d86 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Processing event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.156 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[44b2e874-d261-45b1-bbb2-259cce30cb04]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385312, 'tstamp': 385312}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212015, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385315, 'tstamp': 385315}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212015, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.157 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.199 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.200 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.201 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.201 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.202 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.202 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:56:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:56:52.205 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[86e1661b-77da-40a2-a187-0c794cfc74ff]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.441 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.445 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.448 187247 INFO nova.virt.libvirt.driver [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance spawned successfully.
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.448 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.964 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.965 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.966 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.967 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.967 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:52 compute-0 nova_compute[187243]: 2025-12-02 23:56:52.968 187247 DEBUG nova.virt.libvirt.driver [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:56:53 compute-0 sshd-session[212013]: Invalid user azureuser from 49.247.36.49 port 55511
Dec 02 23:56:53 compute-0 sshd-session[212013]: Received disconnect from 49.247.36.49 port 55511:11: Bye Bye [preauth]
Dec 02 23:56:53 compute-0 sshd-session[212013]: Disconnected from invalid user azureuser 49.247.36.49 port 55511 [preauth]
Dec 02 23:56:53 compute-0 nova_compute[187243]: 2025-12-02 23:56:53.479 187247 INFO nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Took 9.19 seconds to spawn the instance on the hypervisor.
Dec 02 23:56:53 compute-0 nova_compute[187243]: 2025-12-02 23:56:53.479 187247 DEBUG nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.014 187247 INFO nova.compute.manager [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Took 14.88 seconds to build instance.
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.251 187247 DEBUG nova.compute.manager [req-e1deb1be-ac67-4989-a6b8-45b8e63ae828 req-dc147b4f-6371-4305-96db-4ba360d5727a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.252 187247 DEBUG oslo_concurrency.lockutils [req-e1deb1be-ac67-4989-a6b8-45b8e63ae828 req-dc147b4f-6371-4305-96db-4ba360d5727a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.252 187247 DEBUG oslo_concurrency.lockutils [req-e1deb1be-ac67-4989-a6b8-45b8e63ae828 req-dc147b4f-6371-4305-96db-4ba360d5727a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.252 187247 DEBUG oslo_concurrency.lockutils [req-e1deb1be-ac67-4989-a6b8-45b8e63ae828 req-dc147b4f-6371-4305-96db-4ba360d5727a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.253 187247 DEBUG nova.compute.manager [req-e1deb1be-ac67-4989-a6b8-45b8e63ae828 req-dc147b4f-6371-4305-96db-4ba360d5727a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.253 187247 WARNING nova.compute.manager [req-e1deb1be-ac67-4989-a6b8-45b8e63ae828 req-dc147b4f-6371-4305-96db-4ba360d5727a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state None.
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.382 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:54 compute-0 nova_compute[187243]: 2025-12-02 23:56:54.520 187247 DEBUG oslo_concurrency.lockutils [None req-c699bd4a-06e5-4fbc-b61a-24d96d943ece d31b8a74cb3c48f3b147970eec936bca 5f2368878ee9447ea8fcef9927711e2d - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.403s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:56:55 compute-0 podman[212023]: 2025-12-02 23:56:55.150297457 +0000 UTC m=+0.086423715 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:56:55 compute-0 nova_compute[187243]: 2025-12-02 23:56:55.654 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:57 compute-0 sshd-session[212046]: Invalid user kiosk from 20.123.120.169 port 37620
Dec 02 23:56:57 compute-0 sshd-session[212046]: Received disconnect from 20.123.120.169 port 37620:11: Bye Bye [preauth]
Dec 02 23:56:57 compute-0 sshd-session[212046]: Disconnected from invalid user kiosk 20.123.120.169 port 37620 [preauth]
Dec 02 23:56:59 compute-0 podman[212048]: 2025-12-02 23:56:59.093254076 +0000 UTC m=+0.051156031 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 02 23:56:59 compute-0 podman[212049]: 2025-12-02 23:56:59.122924925 +0000 UTC m=+0.078100304 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 23:56:59 compute-0 nova_compute[187243]: 2025-12-02 23:56:59.385 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:56:59 compute-0 podman[197600]: time="2025-12-02T23:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:56:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:56:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Dec 02 23:57:00 compute-0 nova_compute[187243]: 2025-12-02 23:57:00.656 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:00.680 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:00.680 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:00.680 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: ERROR   23:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: ERROR   23:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: ERROR   23:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: ERROR   23:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: ERROR   23:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:57:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:57:02 compute-0 sshd-session[212094]: Received disconnect from 102.210.148.92 port 45952:11: Bye Bye [preauth]
Dec 02 23:57:02 compute-0 sshd-session[212094]: Disconnected from authenticating user root 102.210.148.92 port 45952 [preauth]
Dec 02 23:57:03 compute-0 sshd-session[212105]: Invalid user nominatim from 23.95.37.90 port 39414
Dec 02 23:57:03 compute-0 sshd-session[212105]: Received disconnect from 23.95.37.90 port 39414:11: Bye Bye [preauth]
Dec 02 23:57:03 compute-0 sshd-session[212105]: Disconnected from invalid user nominatim 23.95.37.90 port 39414 [preauth]
Dec 02 23:57:03 compute-0 ovn_controller[95488]: 2025-12-02T23:57:03Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:bc:aa 10.100.0.9
Dec 02 23:57:03 compute-0 ovn_controller[95488]: 2025-12-02T23:57:03Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:bc:aa 10.100.0.9
Dec 02 23:57:04 compute-0 nova_compute[187243]: 2025-12-02 23:57:04.429 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:05 compute-0 nova_compute[187243]: 2025-12-02 23:57:05.658 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:09 compute-0 nova_compute[187243]: 2025-12-02 23:57:09.467 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:10 compute-0 nova_compute[187243]: 2025-12-02 23:57:10.692 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:12 compute-0 podman[212117]: 2025-12-02 23:57:12.138490585 +0000 UTC m=+0.083416071 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, config_id=edpm, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 02 23:57:14 compute-0 nova_compute[187243]: 2025-12-02 23:57:14.513 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:15 compute-0 nova_compute[187243]: 2025-12-02 23:57:15.725 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:17 compute-0 podman[212138]: 2025-12-02 23:57:17.092623719 +0000 UTC m=+0.056596219 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 02 23:57:19 compute-0 nova_compute[187243]: 2025-12-02 23:57:19.546 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:20 compute-0 nova_compute[187243]: 2025-12-02 23:57:20.770 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:24 compute-0 nova_compute[187243]: 2025-12-02 23:57:24.609 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:25 compute-0 nova_compute[187243]: 2025-12-02 23:57:25.804 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:26 compute-0 podman[212159]: 2025-12-02 23:57:26.104523063 +0000 UTC m=+0.062566336 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:57:26 compute-0 nova_compute[187243]: 2025-12-02 23:57:26.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:26 compute-0 nova_compute[187243]: 2025-12-02 23:57:26.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 02 23:57:28 compute-0 nova_compute[187243]: 2025-12-02 23:57:28.129 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:28 compute-0 nova_compute[187243]: 2025-12-02 23:57:28.130 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:57:29 compute-0 nova_compute[187243]: 2025-12-02 23:57:29.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:29 compute-0 nova_compute[187243]: 2025-12-02 23:57:29.611 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:29 compute-0 podman[197600]: time="2025-12-02T23:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:57:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:57:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Dec 02 23:57:30 compute-0 podman[212187]: 2025-12-02 23:57:30.090351822 +0000 UTC m=+0.049974637 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.4)
Dec 02 23:57:30 compute-0 podman[212188]: 2025-12-02 23:57:30.157049068 +0000 UTC m=+0.096649552 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 02 23:57:30 compute-0 nova_compute[187243]: 2025-12-02 23:57:30.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:30 compute-0 nova_compute[187243]: 2025-12-02 23:57:30.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:30 compute-0 nova_compute[187243]: 2025-12-02 23:57:30.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 02 23:57:30 compute-0 nova_compute[187243]: 2025-12-02 23:57:30.806 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:31 compute-0 nova_compute[187243]: 2025-12-02 23:57:31.100 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 02 23:57:31 compute-0 nova_compute[187243]: 2025-12-02 23:57:31.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: ERROR   23:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: ERROR   23:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: ERROR   23:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: ERROR   23:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: ERROR   23:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:57:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:57:33 compute-0 nova_compute[187243]: 2025-12-02 23:57:33.606 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:33 compute-0 nova_compute[187243]: 2025-12-02 23:57:33.607 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:33 compute-0 nova_compute[187243]: 2025-12-02 23:57:33.607 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:34 compute-0 nova_compute[187243]: 2025-12-02 23:57:34.414 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Check if temp file /var/lib/nova/instances/tmpz8owyb_p exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 02 23:57:34 compute-0 nova_compute[187243]: 2025-12-02 23:57:34.421 187247 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d66a42a4-6bab-485d-a45f-0df43bf25d1b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 02 23:57:34 compute-0 nova_compute[187243]: 2025-12-02 23:57:34.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:34 compute-0 nova_compute[187243]: 2025-12-02 23:57:34.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:57:34 compute-0 nova_compute[187243]: 2025-12-02 23:57:34.651 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:35 compute-0 nova_compute[187243]: 2025-12-02 23:57:35.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:35 compute-0 nova_compute[187243]: 2025-12-02 23:57:35.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:35 compute-0 nova_compute[187243]: 2025-12-02 23:57:35.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:35 compute-0 nova_compute[187243]: 2025-12-02 23:57:35.107 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:57:35 compute-0 nova_compute[187243]: 2025-12-02 23:57:35.843 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.184 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.274 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.276 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.361 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.369 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.458 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.460 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.525 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.737 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.740 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.763 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.763 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5507MB free_disk=73.10836029052734GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.764 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:36 compute-0 nova_compute[187243]: 2025-12-02 23:57:36.764 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.773 187247 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.774 187247 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.774 187247 DEBUG nova.network.neutron [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.785 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating resource usage from migration eb5aabe1-cb88-4102-8f99-b6fe8e8e8562
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.786 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating resource usage from migration c53386d3-3d97-4c78-a0cf-66ce9d67e567
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.823 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration eb5aabe1-cb88-4102-8f99-b6fe8e8e8562 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.823 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration c53386d3-3d97-4c78-a0cf-66ce9d67e567 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.824 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.824 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:57:36 up  1:05,  0 user,  load average: 0.52, 0.43, 0.46\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '1', 'num_os_type_None': '2', 'num_proj_5f2368878ee9447ea8fcef9927711e2d': '2', 'io_workload': '1', 'num_task_resize_prep': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:57:37 compute-0 nova_compute[187243]: 2025-12-02 23:57:37.903 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.283 187247 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.412 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.579 187247 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.633 187247 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.634 187247 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.689 187247 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.690 187247 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Preparing to wait for external event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.691 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.691 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.691 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.923 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:57:38 compute-0 nova_compute[187243]: 2025-12-02 23:57:38.924 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.160s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:39 compute-0 nova_compute[187243]: 2025-12-02 23:57:39.653 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:40 compute-0 nova_compute[187243]: 2025-12-02 23:57:40.155 187247 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:40 compute-0 nova_compute[187243]: 2025-12-02 23:57:40.382 187247 DEBUG nova.network.neutron [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:40 compute-0 nova_compute[187243]: 2025-12-02 23:57:40.845 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:40 compute-0 nova_compute[187243]: 2025-12-02 23:57:40.888 187247 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.423 187247 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12417
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.424 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Creating file /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/7896edb7eaa6449a9b6addabef8618f4.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.424 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/7896edb7eaa6449a9b6addabef8618f4.tmp execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.961 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/7896edb7eaa6449a9b6addabef8618f4.tmp" returned: 1 in 0.537s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.962 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/7896edb7eaa6449a9b6addabef8618f4.tmp' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.962 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Creating directory /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c on remote host 192.168.122.101 create_dir /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Dec 02 23:57:42 compute-0 nova_compute[187243]: 2025-12-02 23:57:42.962 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:43 compute-0 podman[212255]: 2025-12-02 23:57:43.122247513 +0000 UTC m=+0.066831330 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Dec 02 23:57:43 compute-0 nova_compute[187243]: 2025-12-02 23:57:43.182 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c" returned: 0 in 0.219s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:43 compute-0 nova_compute[187243]: 2025-12-02 23:57:43.186 187247 DEBUG nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4247
Dec 02 23:57:44 compute-0 sshd-session[212251]: Connection closed by 45.78.219.95 port 40142 [preauth]
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.698 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.886 187247 DEBUG nova.compute.manager [req-8521ec21-9227-4fe3-aa4f-7449cddb2c96 req-18ec3874-f818-4dbe-be53-4313eec1eedb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.887 187247 DEBUG oslo_concurrency.lockutils [req-8521ec21-9227-4fe3-aa4f-7449cddb2c96 req-18ec3874-f818-4dbe-be53-4313eec1eedb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.888 187247 DEBUG oslo_concurrency.lockutils [req-8521ec21-9227-4fe3-aa4f-7449cddb2c96 req-18ec3874-f818-4dbe-be53-4313eec1eedb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.888 187247 DEBUG oslo_concurrency.lockutils [req-8521ec21-9227-4fe3-aa4f-7449cddb2c96 req-18ec3874-f818-4dbe-be53-4313eec1eedb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.888 187247 DEBUG nova.compute.manager [req-8521ec21-9227-4fe3-aa4f-7449cddb2c96 req-18ec3874-f818-4dbe-be53-4313eec1eedb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No event matching network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb in dict_keys([('network-vif-plugged', 'aa1a4037-7471-48e2-8297-5aeb45672ebb')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 02 23:57:44 compute-0 nova_compute[187243]: 2025-12-02 23:57:44.888 187247 DEBUG nova.compute.manager [req-8521ec21-9227-4fe3-aa4f-7449cddb2c96 req-18ec3874-f818-4dbe-be53-4313eec1eedb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:57:45 compute-0 kernel: tap933e46ed-57 (unregistering): left promiscuous mode
Dec 02 23:57:45 compute-0 NetworkManager[55671]: <info>  [1764719865.3478] device (tap933e46ed-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:57:45 compute-0 ovn_controller[95488]: 2025-12-02T23:57:45Z|00076|binding|INFO|Releasing lport 933e46ed-57a7-472a-adf9-eff09ae7c559 from this chassis (sb_readonly=0)
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.358 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-0 ovn_controller[95488]: 2025-12-02T23:57:45Z|00077|binding|INFO|Setting lport 933e46ed-57a7-472a-adf9-eff09ae7c559 down in Southbound
Dec 02 23:57:45 compute-0 ovn_controller[95488]: 2025-12-02T23:57:45Z|00078|binding|INFO|Removing iface tap933e46ed-57 ovn-installed in OVS
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.360 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.365 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:bc:aa 10.100.0.9'], port_security=['fa:16:3e:2d:bc:aa 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '35a3db0d-2b6a-47be-bc85-4b164026935c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=933e46ed-57a7-472a-adf9-eff09ae7c559) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.367 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 933e46ed-57a7-472a-adf9-eff09ae7c559 in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.368 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec494140-a5f4-4327-8807-d7248b1cdc9a
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.381 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf4c4f8-dbe6-4afa-a0d6-6babc9332d79]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.383 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.423 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ba074b14-2d9e-4dcc-a4fa-9fa3bb2319a4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.426 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ceafcbfa-885b-4bad-a022-48ec6360127c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 02 23:57:45 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 14.119s CPU time.
Dec 02 23:57:45 compute-0 systemd-machined[153518]: Machine qemu-5-instance-00000008 terminated.
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.466 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[403ad639-3a8f-470d-a133-41d690a0f83d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.497 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eda77dbe-ed77-4b34-867f-5383741588c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec494140-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:0f:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385302, 'reachable_time': 32877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212303, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.523 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2422959b-61ed-4e94-9135-7a85a0847d93]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385312, 'tstamp': 385312}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212304, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec494140-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385315, 'tstamp': 385315}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212304, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.525 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.527 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.531 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.532 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec494140-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.532 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.533 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec494140-a0, col_values=(('external_ids', {'iface-id': '9ee451cb-cc6e-44d6-98fb-cdfa0566e521'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.533 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.535 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8370d735-e7cb-42a0-997d-f922fe42f3d0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ec494140-a5f4-4327-8807-d7248b1cdc9a\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ec494140-a5f4-4327-8807-d7248b1cdc9a\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.567 187247 DEBUG nova.compute.manager [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.568 187247 DEBUG oslo_concurrency.lockutils [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.568 187247 DEBUG oslo_concurrency.lockutils [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.569 187247 DEBUG oslo_concurrency.lockutils [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.569 187247 DEBUG nova.compute.manager [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.570 187247 WARNING nova.compute.manager [req-08cc2127-8638-4855-932e-23e36f5da58c req-0d5f124e-5683-4296-845b-de1c46e517d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state resize_migrating.
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.674 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.674 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:45 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:45.675 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:57:45 compute-0 nova_compute[187243]: 2025-12-02 23:57:45.891 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.201 187247 INFO nova.virt.libvirt.driver [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance shutdown successfully after 3 seconds.
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.207 187247 INFO nova.virt.libvirt.driver [-] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Instance destroyed successfully.
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.208 187247 DEBUG nova.virt.libvirt.vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:57:32Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.208 187247 DEBUG nova.network.os_vif_util [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "vif_mac": "fa:16:3e:2d:bc:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.208 187247 DEBUG nova.network.os_vif_util [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.209 187247 DEBUG os_vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.211 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.211 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap933e46ed-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.212 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.214 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.214 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.215 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bd6e2a54-7c5f-4436-965b-ad5a9707e345) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.215 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.217 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.220 187247 INFO os_vif [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57')
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.225 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:46 compute-0 sshd-session[212290]: Received disconnect from 61.220.235.10 port 39162:11: Bye Bye [preauth]
Dec 02 23:57:46 compute-0 sshd-session[212290]: Disconnected from authenticating user root 61.220.235.10 port 39162 [preauth]
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.314 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.315 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.379 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.381 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Copying file /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk to 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.381 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:46 compute-0 nova_compute[187243]: 2025-12-02 23:57:46.714 187247 INFO nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Took 8.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.041 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "scp -r /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk" returned: 0 in 0.660s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.042 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Copying file /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.042 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk.config 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.196 187247 DEBUG nova.compute.manager [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.197 187247 DEBUG oslo_concurrency.lockutils [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.197 187247 DEBUG oslo_concurrency.lockutils [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.197 187247 DEBUG oslo_concurrency.lockutils [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.197 187247 DEBUG nova.compute.manager [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Processing event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.198 187247 DEBUG nova.compute.manager [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-changed-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.198 187247 DEBUG nova.compute.manager [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Refreshing instance network info cache due to event network-changed-aa1a4037-7471-48e2-8297-5aeb45672ebb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.198 187247 DEBUG oslo_concurrency.lockutils [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.198 187247 DEBUG oslo_concurrency.lockutils [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.199 187247 DEBUG nova.network.neutron [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Refreshing network info cache for port aa1a4037-7471-48e2-8297-5aeb45672ebb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.200 187247 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.301 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "scp -C -r /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk.config 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.config" returned: 0 in 0.259s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.302 187247 DEBUG nova.virt.libvirt.volume.remotefs [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Copying file /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.302 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk.info 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.552 187247 DEBUG oslo_concurrency.processutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "scp -C -r /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c_resize/disk.info 192.168.122.101:/var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk.info" returned: 0 in 0.250s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.554 187247 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.555 187247 WARNING neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.712 187247 DEBUG nova.compute.manager [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.713 187247 DEBUG oslo_concurrency.lockutils [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.714 187247 DEBUG oslo_concurrency.lockutils [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.714 187247 DEBUG oslo_concurrency.lockutils [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.715 187247 DEBUG nova.compute.manager [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.715 187247 WARNING nova.compute.manager [req-8d1c219e-52ea-48e4-852e-516b6fe40743 req-293e809d-ead7-4c07-8edd-9d4c02fd9b6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-unplugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state resize_migrating.
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.790 187247 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz8owyb_p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d66a42a4-6bab-485d-a45f-0df43bf25d1b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(eb5aabe1-cb88-4102-8f99-b6fe8e8e8562),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.792 187247 WARNING neutronclient.v2_0.client [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:47 compute-0 nova_compute[187243]: 2025-12-02 23:57:47.921 187247 DEBUG neutronclient.v2_0.client [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 933e46ed-57a7-472a-adf9-eff09ae7c559 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Dec 02 23:57:48 compute-0 podman[212336]: 2025-12-02 23:57:48.135501549 +0000 UTC m=+0.090267485 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.267 187247 WARNING neutronclient.v2_0.client [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.312 187247 DEBUG nova.objects.instance [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid d66a42a4-6bab-485d-a45f-0df43bf25d1b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.314 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.316 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.317 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.479 187247 DEBUG nova.network.neutron [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updated VIF entry in instance network info cache for port aa1a4037-7471-48e2-8297-5aeb45672ebb. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.480 187247 DEBUG nova.network.neutron [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating instance_info_cache with network_info: [{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.818 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.819 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.828 187247 DEBUG nova.virt.libvirt.vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:55:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-116577734',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-116577734',id=6,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-dt7jcyvd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:56:04Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.828 187247 DEBUG nova.network.os_vif_util [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.829 187247 DEBUG nova.network.os_vif_util [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.829 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Updating guest XML with vif config: <interface type="ethernet">
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:fd:d7:48"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <target dev="tapaa1a4037-74"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]: </interface>
Dec 02 23:57:48 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.829 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <name>instance-00000006</name>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <uuid>d66a42a4-6bab-485d-a45f-0df43bf25d1b</uuid>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-116577734</nova:name>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:55:58</nova:creationTime>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:port uuid="aa1a4037-7471-48e2-8297-5aeb45672ebb">
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <resource>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </resource>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <system>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="serial">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="uuid">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </system>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <os>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </os>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <features>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </features>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <readonly/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:fd:d7:48"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="tapaa1a4037-74"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </target>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <console type="pty">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </console>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </input>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <video>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </video>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]: </domain>
Dec 02 23:57:48 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.830 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <name>instance-00000006</name>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <uuid>d66a42a4-6bab-485d-a45f-0df43bf25d1b</uuid>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-116577734</nova:name>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:55:58</nova:creationTime>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:port uuid="aa1a4037-7471-48e2-8297-5aeb45672ebb">
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <resource>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </resource>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <system>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="serial">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="uuid">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </system>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <os>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </os>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <features>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </features>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <readonly/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:fd:d7:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa1a4037-74"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </target>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <console type="pty">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </console>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </input>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <video>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </video>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]: </domain>
Dec 02 23:57:48 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.830 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <name>instance-00000006</name>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <uuid>d66a42a4-6bab-485d-a45f-0df43bf25d1b</uuid>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-116577734</nova:name>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:55:58</nova:creationTime>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:user uuid="d31b8a74cb3c48f3b147970eec936bca">tempest-TestExecuteActionsViaActuator-1889160444-project-admin</nova:user>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:project uuid="5f2368878ee9447ea8fcef9927711e2d">tempest-TestExecuteActionsViaActuator-1889160444</nova:project>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <nova:port uuid="aa1a4037-7471-48e2-8297-5aeb45672ebb">
Dec 02 23:57:48 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <resource>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </resource>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <system>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="serial">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="uuid">d66a42a4-6bab-485d-a45f-0df43bf25d1b</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </system>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <os>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </os>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <features>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </features>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/disk.config"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <readonly/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </controller>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:fd:d7:48"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa1a4037-74"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 02 23:57:48 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       </target>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <console type="pty">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b/console.log" append="off"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </console>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </input>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </graphics>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <video>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </video>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:57:48 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:57:48 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 02 23:57:48 compute-0 nova_compute[187243]: </domain>
Dec 02 23:57:48 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.830 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.966 187247 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.966 187247 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.966 187247 DEBUG oslo_concurrency.lockutils [None req-6b95f48c-f5a4-4680-9d25-5b29b3d314c6 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:48 compute-0 nova_compute[187243]: 2025-12-02 23:57:48.985 187247 DEBUG oslo_concurrency.lockutils [req-72046207-f427-4a16-bbbb-09a81d3c9b52 req-00375d0b-d865-491d-9324-0dc0b1844148 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-d66a42a4-6bab-485d-a45f-0df43bf25d1b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:49 compute-0 nova_compute[187243]: 2025-12-02 23:57:49.321 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 02 23:57:49 compute-0 nova_compute[187243]: 2025-12-02 23:57:49.322 187247 INFO nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 02 23:57:50 compute-0 sshd-session[212185]: Received disconnect from 45.78.222.160 port 55840:11: Bye Bye [preauth]
Dec 02 23:57:50 compute-0 sshd-session[212185]: Disconnected from authenticating user root 45.78.222.160 port 55840 [preauth]
Dec 02 23:57:50 compute-0 nova_compute[187243]: 2025-12-02 23:57:50.343 187247 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 02 23:57:50 compute-0 nova_compute[187243]: 2025-12-02 23:57:50.847 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 02 23:57:50 compute-0 nova_compute[187243]: 2025-12-02 23:57:50.847 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 02 23:57:50 compute-0 nova_compute[187243]: 2025-12-02 23:57:50.892 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.216 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.350 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.350 187247 DEBUG nova.virt.libvirt.migration [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 02 23:57:51 compute-0 kernel: tapaa1a4037-74 (unregistering): left promiscuous mode
Dec 02 23:57:51 compute-0 NetworkManager[55671]: <info>  [1764719871.4936] device (tapaa1a4037-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.502 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 ovn_controller[95488]: 2025-12-02T23:57:51Z|00079|binding|INFO|Releasing lport aa1a4037-7471-48e2-8297-5aeb45672ebb from this chassis (sb_readonly=0)
Dec 02 23:57:51 compute-0 ovn_controller[95488]: 2025-12-02T23:57:51Z|00080|binding|INFO|Setting lport aa1a4037-7471-48e2-8297-5aeb45672ebb down in Southbound
Dec 02 23:57:51 compute-0 ovn_controller[95488]: 2025-12-02T23:57:51Z|00081|binding|INFO|Removing iface tapaa1a4037-74 ovn-installed in OVS
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.518 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:d7:48 10.100.0.12'], port_security=['fa:16:3e:fd:d7:48 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd66a42a4-6bab-485d-a45f-0df43bf25d1b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f2368878ee9447ea8fcef9927711e2d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dbe9f254-ff27-47a4-8a8a-2e008ed5fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dafccd5a-1323-4463-a427-917b548298c4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=aa1a4037-7471-48e2-8297-5aeb45672ebb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.520 104379 INFO neutron.agent.ovn.metadata.agent [-] Port aa1a4037-7471-48e2-8297-5aeb45672ebb in datapath ec494140-a5f4-4327-8807-d7248b1cdc9a unbound from our chassis
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.523 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec494140-a5f4-4327-8807-d7248b1cdc9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.524 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b449a887-68cf-4b55-893e-fc8783f1f291]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.525 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a namespace which is not needed anymore
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.529 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 02 23:57:51 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 18.361s CPU time.
Dec 02 23:57:51 compute-0 systemd-machined[153518]: Machine qemu-4-instance-00000006 terminated.
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.690 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [NOTICE]   (211770) : haproxy version is 3.0.5-8e879a5
Dec 02 23:57:51 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [NOTICE]   (211770) : path to executable is /usr/sbin/haproxy
Dec 02 23:57:51 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [WARNING]  (211770) : Exiting Master process...
Dec 02 23:57:51 compute-0 podman[212402]: 2025-12-02 23:57:51.692830021 +0000 UTC m=+0.052350115 container kill cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Dec 02 23:57:51 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [ALERT]    (211770) : Current worker (211772) exited with code 143 (Terminated)
Dec 02 23:57:51 compute-0 neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a[211760]: [WARNING]  (211770) : All workers exited. Exiting... (0)
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.697 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 systemd[1]: libpod-cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131.scope: Deactivated successfully.
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.741 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.741 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.742 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 02 23:57:51 compute-0 podman[212423]: 2025-12-02 23:57:51.76576586 +0000 UTC m=+0.043137279 container died cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 02 23:57:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131-userdata-shm.mount: Deactivated successfully.
Dec 02 23:57:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-96cf13fd13da20b3e5390d38a40da03343604dce6347c3b582255b3ea079878c-merged.mount: Deactivated successfully.
Dec 02 23:57:51 compute-0 podman[212423]: 2025-12-02 23:57:51.806726555 +0000 UTC m=+0.084097934 container cleanup cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 02 23:57:51 compute-0 systemd[1]: libpod-conmon-cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131.scope: Deactivated successfully.
Dec 02 23:57:51 compute-0 podman[212430]: 2025-12-02 23:57:51.828696613 +0000 UTC m=+0.088411009 container remove cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.835 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f9480776-7a0d-4213-a470-f25c7441071b]: (4, ("Tue Dec  2 11:57:51 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a (cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131)\ncbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131\nTue Dec  2 11:57:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a (cbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131)\ncbab8e33444c4ebe3b4a0945ee092155d70382b41a45b0a2c4ca534ccef75131\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.837 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c5520c7e-392b-430d-9e71-9e51a2a7e374]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.837 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec494140-a5f4-4327-8807-d7248b1cdc9a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.838 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3b44af03-08f3-4261-9f20-b7e2e12f1efc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.838 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec494140-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.841 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 kernel: tapec494140-a0: left promiscuous mode
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.852 187247 DEBUG nova.virt.libvirt.guest [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'd66a42a4-6bab-485d-a45f-0df43bf25d1b' (instance-00000006) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.853 187247 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migration operation has completed
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.853 187247 INFO nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] _post_live_migration() is started..
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.862 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.863 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.865 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c01f5a87-a5a5-4344-828f-9420d88109c6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.870 187247 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:51 compute-0 nova_compute[187243]: 2025-12-02 23:57:51.870 187247 WARNING neutronclient.v2_0.client [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.885 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8254d651-0308-41f8-8eb3-cee423e1b2d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.885 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6a48b336-ec8a-4e56-aef2-4e1a7ea84c3f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.905 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[826e581e-569b-467a-92b6-1ec153eff691]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385295, 'reachable_time': 42560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212467, 'error': None, 'target': 'ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.907 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec494140-a5f4-4327-8807-d7248b1cdc9a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 02 23:57:51 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:51.907 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[e73257fe-cd17-4433-8296-9e26f4c3c0c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:57:51 compute-0 systemd[1]: run-netns-ovnmeta\x2dec494140\x2da5f4\x2d4327\x2d8807\x2dd7248b1cdc9a.mount: Deactivated successfully.
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.231 187247 DEBUG nova.compute.manager [req-0575ca10-a77f-4829-bb00-df8c0005b148 req-d377fddf-cc31-4d1d-b3f7-98ad649df9d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.231 187247 DEBUG oslo_concurrency.lockutils [req-0575ca10-a77f-4829-bb00-df8c0005b148 req-d377fddf-cc31-4d1d-b3f7-98ad649df9d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.231 187247 DEBUG oslo_concurrency.lockutils [req-0575ca10-a77f-4829-bb00-df8c0005b148 req-d377fddf-cc31-4d1d-b3f7-98ad649df9d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.231 187247 DEBUG oslo_concurrency.lockutils [req-0575ca10-a77f-4829-bb00-df8c0005b148 req-d377fddf-cc31-4d1d-b3f7-98ad649df9d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.232 187247 DEBUG nova.compute.manager [req-0575ca10-a77f-4829-bb00-df8c0005b148 req-d377fddf-cc31-4d1d-b3f7-98ad649df9d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.232 187247 DEBUG nova.compute.manager [req-0575ca10-a77f-4829-bb00-df8c0005b148 req-d377fddf-cc31-4d1d-b3f7-98ad649df9d6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.314 187247 DEBUG nova.compute.manager [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-changed-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.314 187247 DEBUG nova.compute.manager [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Refreshing instance network info cache due to event network-changed-933e46ed-57a7-472a-adf9-eff09ae7c559. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.314 187247 DEBUG oslo_concurrency.lockutils [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.314 187247 DEBUG oslo_concurrency.lockutils [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.315 187247 DEBUG nova.network.neutron [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Refreshing network info cache for port 933e46ed-57a7-472a-adf9-eff09ae7c559 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.507 187247 DEBUG nova.network.neutron [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port aa1a4037-7471-48e2-8297-5aeb45672ebb and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.508 187247 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.510 187247 DEBUG nova.virt.libvirt.vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:55:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-116577734',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-116577734',id=6,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:56:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-dt7jcyvd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:57:29Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=d66a42a4-6bab-485d-a45f-0df43bf25d1b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.510 187247 DEBUG nova.network.os_vif_util [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "address": "fa:16:3e:fd:d7:48", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa1a4037-74", "ovs_interfaceid": "aa1a4037-7471-48e2-8297-5aeb45672ebb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.512 187247 DEBUG nova.network.os_vif_util [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.513 187247 DEBUG os_vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.516 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.517 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa1a4037-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.558 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.560 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.561 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.562 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ffdd2017-1a07-4d10-ada3-3b9ccda9c5a1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.562 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.564 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.566 187247 INFO os_vif [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:d7:48,bridge_name='br-int',has_traffic_filtering=True,id=aa1a4037-7471-48e2-8297-5aeb45672ebb,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa1a4037-74')
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.566 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.566 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.567 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.567 187247 DEBUG nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.567 187247 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Deleting instance files /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b_del
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.568 187247 INFO nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Deletion of /var/lib/nova/instances/d66a42a4-6bab-485d-a45f-0df43bf25d1b_del complete
Dec 02 23:57:52 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:57:52.676 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:57:52 compute-0 nova_compute[187243]: 2025-12-02 23:57:52.820 187247 WARNING neutronclient.v2_0.client [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:53 compute-0 nova_compute[187243]: 2025-12-02 23:57:53.277 187247 WARNING neutronclient.v2_0.client [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:57:53 compute-0 nova_compute[187243]: 2025-12-02 23:57:53.486 187247 DEBUG nova.network.neutron [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updated VIF entry in instance network info cache for port 933e46ed-57a7-472a-adf9-eff09ae7c559. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 02 23:57:53 compute-0 nova_compute[187243]: 2025-12-02 23:57:53.487 187247 DEBUG nova.network.neutron [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:57:53 compute-0 nova_compute[187243]: 2025-12-02 23:57:53.995 187247 DEBUG oslo_concurrency.lockutils [req-b80f3396-a1ea-4749-afae-158f070a79c7 req-bc36acb1-6d9e-4ef2-885c-a67e81287de0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.321 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.321 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.322 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.322 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.322 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.323 187247 WARNING nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received unexpected event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with vm_state active and task_state migrating.
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.323 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.324 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.324 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.324 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.325 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.325 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.325 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.326 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.326 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.327 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.327 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.327 187247 WARNING nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received unexpected event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with vm_state active and task_state migrating.
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.328 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.328 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.328 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.329 187247 DEBUG oslo_concurrency.lockutils [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.329 187247 DEBUG nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.330 187247 WARNING nova.compute.manager [req-ebc5d3ba-387f-4022-800d-b4585e1fe2a7 req-455ba53d-95c1-471e-b0e6-4f45f7c17145 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received unexpected event network-vif-plugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with vm_state active and task_state migrating.
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.394 187247 DEBUG nova.compute.manager [req-d8663919-019c-4d61-b39d-da69cb4261f2 req-a3fe4da9-5a0b-465f-8bfa-9bffc9d8fb05 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.394 187247 DEBUG oslo_concurrency.lockutils [req-d8663919-019c-4d61-b39d-da69cb4261f2 req-a3fe4da9-5a0b-465f-8bfa-9bffc9d8fb05 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.395 187247 DEBUG oslo_concurrency.lockutils [req-d8663919-019c-4d61-b39d-da69cb4261f2 req-a3fe4da9-5a0b-465f-8bfa-9bffc9d8fb05 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.395 187247 DEBUG oslo_concurrency.lockutils [req-d8663919-019c-4d61-b39d-da69cb4261f2 req-a3fe4da9-5a0b-465f-8bfa-9bffc9d8fb05 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.395 187247 DEBUG nova.compute.manager [req-d8663919-019c-4d61-b39d-da69cb4261f2 req-a3fe4da9-5a0b-465f-8bfa-9bffc9d8fb05 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] No waiting events found dispatching network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:54 compute-0 nova_compute[187243]: 2025-12-02 23:57:54.396 187247 DEBUG nova.compute.manager [req-d8663919-019c-4d61-b39d-da69cb4261f2 req-a3fe4da9-5a0b-465f-8bfa-9bffc9d8fb05 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Received event network-vif-unplugged-aa1a4037-7471-48e2-8297-5aeb45672ebb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 02 23:57:55 compute-0 nova_compute[187243]: 2025-12-02 23:57:55.894 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:57 compute-0 podman[212468]: 2025-12-02 23:57:57.103246569 +0000 UTC m=+0.057232845 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 02 23:57:57 compute-0 nova_compute[187243]: 2025-12-02 23:57:57.563 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:57:58 compute-0 nova_compute[187243]: 2025-12-02 23:57:58.355 187247 DEBUG nova.compute.manager [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:57:58 compute-0 nova_compute[187243]: 2025-12-02 23:57:58.356 187247 DEBUG oslo_concurrency.lockutils [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:57:58 compute-0 nova_compute[187243]: 2025-12-02 23:57:58.356 187247 DEBUG oslo_concurrency.lockutils [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:57:58 compute-0 nova_compute[187243]: 2025-12-02 23:57:58.357 187247 DEBUG oslo_concurrency.lockutils [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:57:58 compute-0 nova_compute[187243]: 2025-12-02 23:57:58.357 187247 DEBUG nova.compute.manager [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:57:58 compute-0 nova_compute[187243]: 2025-12-02 23:57:58.357 187247 WARNING nova.compute.manager [req-48c81127-40d9-4909-a8c7-901eff78e27d req-d95c2221-f216-4bcb-9711-7ce7c9d0f9d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state active and task_state resize_finish.
Dec 02 23:57:59 compute-0 podman[197600]: time="2025-12-02T23:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:57:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:57:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.428 187247 DEBUG nova.compute.manager [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.429 187247 DEBUG oslo_concurrency.lockutils [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.429 187247 DEBUG oslo_concurrency.lockutils [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.430 187247 DEBUG oslo_concurrency.lockutils [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.430 187247 DEBUG nova.compute.manager [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] No waiting events found dispatching network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.431 187247 WARNING nova.compute.manager [req-822ccfe8-fe6f-48f1-9c36-6ee918b0a604 req-e25ba2fa-d139-4c94-af12-4e84e5adcc2b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Received unexpected event network-vif-plugged-933e46ed-57a7-472a-adf9-eff09ae7c559 for instance with vm_state resized and task_state None.
Dec 02 23:58:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:58:00.681 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:58:00.682 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:58:00.682 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:00 compute-0 nova_compute[187243]: 2025-12-02 23:58:00.935 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:01 compute-0 podman[212493]: 2025-12-02 23:58:01.155027646 +0000 UTC m=+0.096713953 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Dec 02 23:58:01 compute-0 podman[212494]: 2025-12-02 23:58:01.199585339 +0000 UTC m=+0.134285904 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: ERROR   23:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: ERROR   23:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: ERROR   23:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: ERROR   23:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: ERROR   23:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:58:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:58:01 compute-0 anacron[7485]: Job `cron.weekly' started
Dec 02 23:58:01 compute-0 anacron[7485]: Job `cron.weekly' terminated
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.105 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.106 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.106 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "d66a42a4-6bab-485d-a45f-0df43bf25d1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.565 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.628 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.628 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.629 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:02 compute-0 nova_compute[187243]: 2025-12-02 23:58:02.629 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.670 187247 WARNING nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-00000008, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/35a3db0d-2b6a-47be-bc85-4b164026935c/disk
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.831 187247 WARNING nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.832 187247 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.869 187247 DEBUG oslo_concurrency.processutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.870 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5815MB free_disk=73.13711166381836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.870 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:03 compute-0 nova_compute[187243]: 2025-12-02 23:58:03.870 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:04 compute-0 nova_compute[187243]: 2025-12-02 23:58:04.902 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance d66a42a4-6bab-485d-a45f-0df43bf25d1b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 02 23:58:04 compute-0 nova_compute[187243]: 2025-12-02 23:58:04.902 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 35a3db0d-2b6a-47be-bc85-4b164026935c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.410 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.861 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "35a3db0d-2b6a-47be-bc85-4b164026935c" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.862 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.862 187247 DEBUG nova.compute.manager [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Going to confirm migration 3 do_confirm_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:5287
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.918 187247 INFO nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating resource usage from migration c53386d3-3d97-4c78-a0cf-66ce9d67e567
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.919 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Starting to track outgoing migration c53386d3-3d97-4c78-a0cf-66ce9d67e567 with flavor b2669e62-ef04-4b34-b3d6-69efcfbafbdc _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1549
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.950 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration eb5aabe1-cb88-4102-8f99-b6fe8e8e8562 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.950 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration c53386d3-3d97-4c78-a0cf-66ce9d67e567 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.951 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.951 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:58:03 up  1:06,  0 user,  load average: 0.34, 0.39, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:58:05 compute-0 nova_compute[187243]: 2025-12-02 23:58:05.969 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:06 compute-0 nova_compute[187243]: 2025-12-02 23:58:06.013 187247 DEBUG nova.compute.provider_tree [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:06 compute-0 nova_compute[187243]: 2025-12-02 23:58:06.380 187247 DEBUG nova.objects.instance [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'info_cache' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:06 compute-0 nova_compute[187243]: 2025-12-02 23:58:06.520 187247 DEBUG nova.scheduler.client.report [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:06 compute-0 nova_compute[187243]: 2025-12-02 23:58:06.906 187247 WARNING neutronclient.v2_0.client [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.042 187247 DEBUG nova.compute.resource_tracker [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.043 187247 DEBUG oslo_concurrency.lockutils [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.172s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.057 187247 INFO nova.compute.manager [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.196 187247 WARNING neutronclient.v2_0.client [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.196 187247 WARNING neutronclient.v2_0.client [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.398 187247 DEBUG neutronclient.v2_0.client [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 933e46ed-57a7-472a-adf9-eff09ae7c559 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.399 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.399 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.399 187247 DEBUG nova.network.neutron [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.567 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:07 compute-0 nova_compute[187243]: 2025-12-02 23:58:07.905 187247 WARNING neutronclient.v2_0.client [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:08 compute-0 nova_compute[187243]: 2025-12-02 23:58:08.135 187247 INFO nova.scheduler.client.report [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration eb5aabe1-cb88-4102-8f99-b6fe8e8e8562
Dec 02 23:58:08 compute-0 nova_compute[187243]: 2025-12-02 23:58:08.135 187247 DEBUG nova.virt.libvirt.driver [None req-ca2de752-5165-430f-b3d2-029dea9ed96e 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: d66a42a4-6bab-485d-a45f-0df43bf25d1b] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 02 23:58:08 compute-0 nova_compute[187243]: 2025-12-02 23:58:08.528 187247 WARNING neutronclient.v2_0.client [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:58:08 compute-0 nova_compute[187243]: 2025-12-02 23:58:08.718 187247 DEBUG nova.network.neutron [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 35a3db0d-2b6a-47be-bc85-4b164026935c] Updating instance_info_cache with network_info: [{"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.224 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-35a3db0d-2b6a-47be-bc85-4b164026935c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.225 187247 DEBUG nova.objects.instance [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 35a3db0d-2b6a-47be-bc85-4b164026935c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.734 187247 DEBUG nova.objects.base [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<35a3db0d-2b6a-47be-bc85-4b164026935c> lazy-loaded attributes: info_cache,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.750 187247 DEBUG nova.virt.libvirt.vif [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-02T23:56:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-508367334',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-508367334',id=8,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:57:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5f2368878ee9447ea8fcef9927711e2d',ramdisk_id='',reservation_id='r-q02f1mi0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1889160444',owner_user_name='tempest-TestExecuteActionsViaActuator-1889160444-project-admin'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:57:59Z,user_data=None,user_id='d31b8a74cb3c48f3b147970eec936bca',uuid=35a3db0d-2b6a-47be-bc85-4b164026935c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.751 187247 DEBUG nova.network.os_vif_util [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "933e46ed-57a7-472a-adf9-eff09ae7c559", "address": "fa:16:3e:2d:bc:aa", "network": {"id": "ec494140-a5f4-4327-8807-d7248b1cdc9a", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1938172404-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ba5fccf757b4adaa08907c11ae17f57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap933e46ed-57", "ovs_interfaceid": "933e46ed-57a7-472a-adf9-eff09ae7c559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.752 187247 DEBUG nova.network.os_vif_util [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.753 187247 DEBUG os_vif [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.756 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.756 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap933e46ed-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.757 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.759 187247 INFO os_vif [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:bc:aa,bridge_name='br-int',has_traffic_filtering=True,id=933e46ed-57a7-472a-adf9-eff09ae7c559,network=Network(ec494140-a5f4-4327-8807-d7248b1cdc9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap933e46ed-57')
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.759 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:09 compute-0 nova_compute[187243]: 2025-12-02 23:58:09.760 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:10 compute-0 nova_compute[187243]: 2025-12-02 23:58:10.314 187247 DEBUG nova.compute.provider_tree [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:10 compute-0 nova_compute[187243]: 2025-12-02 23:58:10.825 187247 DEBUG nova.scheduler.client.report [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:10 compute-0 nova_compute[187243]: 2025-12-02 23:58:10.971 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:11 compute-0 nova_compute[187243]: 2025-12-02 23:58:11.859 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:12 compute-0 nova_compute[187243]: 2025-12-02 23:58:12.486 187247 INFO nova.scheduler.client.report [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration c53386d3-3d97-4c78-a0cf-66ce9d67e567
Dec 02 23:58:12 compute-0 nova_compute[187243]: 2025-12-02 23:58:12.569 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:12 compute-0 sshd-session[212536]: Received disconnect from 102.210.148.92 port 47106:11: Bye Bye [preauth]
Dec 02 23:58:12 compute-0 sshd-session[212536]: Disconnected from authenticating user root 102.210.148.92 port 47106 [preauth]
Dec 02 23:58:13 compute-0 nova_compute[187243]: 2025-12-02 23:58:13.000 187247 DEBUG oslo_concurrency.lockutils [None req-6110f999-2d7c-461e-bf47-606dc736a444 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "35a3db0d-2b6a-47be-bc85-4b164026935c" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 7.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:14 compute-0 podman[212538]: 2025-12-02 23:58:14.094486038 +0000 UTC m=+0.051669978 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, architecture=x86_64, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 02 23:58:16 compute-0 nova_compute[187243]: 2025-12-02 23:58:16.012 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:16 compute-0 sshd-session[212559]: Invalid user builder from 23.95.37.90 port 39424
Dec 02 23:58:16 compute-0 sshd-session[212559]: Received disconnect from 23.95.37.90 port 39424:11: Bye Bye [preauth]
Dec 02 23:58:16 compute-0 sshd-session[212559]: Disconnected from invalid user builder 23.95.37.90 port 39424 [preauth]
Dec 02 23:58:17 compute-0 nova_compute[187243]: 2025-12-02 23:58:17.570 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:19 compute-0 podman[212561]: 2025-12-02 23:58:19.11810998 +0000 UTC m=+0.064204235 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:58:21 compute-0 nova_compute[187243]: 2025-12-02 23:58:21.062 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:21 compute-0 sshd-session[212582]: Invalid user es from 49.247.36.49 port 6304
Dec 02 23:58:21 compute-0 sshd-session[212582]: Received disconnect from 49.247.36.49 port 6304:11: Bye Bye [preauth]
Dec 02 23:58:21 compute-0 sshd-session[212582]: Disconnected from invalid user es 49.247.36.49 port 6304 [preauth]
Dec 02 23:58:22 compute-0 nova_compute[187243]: 2025-12-02 23:58:22.573 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:23 compute-0 sshd-session[212585]: Invalid user root1 from 20.123.120.169 port 36868
Dec 02 23:58:23 compute-0 sshd-session[212585]: Received disconnect from 20.123.120.169 port 36868:11: Bye Bye [preauth]
Dec 02 23:58:23 compute-0 sshd-session[212585]: Disconnected from invalid user root1 20.123.120.169 port 36868 [preauth]
Dec 02 23:58:26 compute-0 nova_compute[187243]: 2025-12-02 23:58:26.110 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:27 compute-0 nova_compute[187243]: 2025-12-02 23:58:27.575 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:28 compute-0 podman[212587]: 2025-12-02 23:58:28.127719757 +0000 UTC m=+0.083185591 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:58:29 compute-0 podman[197600]: time="2025-12-02T23:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:58:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:58:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec 02 23:58:31 compute-0 nova_compute[187243]: 2025-12-02 23:58:31.152 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: ERROR   23:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: ERROR   23:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: ERROR   23:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: ERROR   23:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: ERROR   23:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:58:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:58:32 compute-0 podman[212611]: 2025-12-02 23:58:32.096992012 +0000 UTC m=+0.057473811 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 02 23:58:32 compute-0 podman[212612]: 2025-12-02 23:58:32.171362366 +0000 UTC m=+0.116488798 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 02 23:58:32 compute-0 nova_compute[187243]: 2025-12-02 23:58:32.577 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:32 compute-0 nova_compute[187243]: 2025-12-02 23:58:32.925 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:32 compute-0 nova_compute[187243]: 2025-12-02 23:58:32.926 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:32 compute-0 nova_compute[187243]: 2025-12-02 23:58:32.926 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:32 compute-0 nova_compute[187243]: 2025-12-02 23:58:32.926 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:58:33 compute-0 nova_compute[187243]: 2025-12-02 23:58:33.589 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:33 compute-0 nova_compute[187243]: 2025-12-02 23:58:33.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:34 compute-0 nova_compute[187243]: 2025-12-02 23:58:34.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:34 compute-0 nova_compute[187243]: 2025-12-02 23:58:34.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.114 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.115 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.115 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.116 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.329 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.331 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.367 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.368 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.16595840454102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.368 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:58:35 compute-0 nova_compute[187243]: 2025-12-02 23:58:35.369 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:58:36 compute-0 nova_compute[187243]: 2025-12-02 23:58:36.156 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:36 compute-0 nova_compute[187243]: 2025-12-02 23:58:36.494 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:58:36 compute-0 nova_compute[187243]: 2025-12-02 23:58:36.495 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:58:35 up  1:06,  0 user,  load average: 0.19, 0.35, 0.43\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:58:36 compute-0 nova_compute[187243]: 2025-12-02 23:58:36.582 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:58:37 compute-0 nova_compute[187243]: 2025-12-02 23:58:37.093 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:58:37 compute-0 nova_compute[187243]: 2025-12-02 23:58:37.578 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:37 compute-0 nova_compute[187243]: 2025-12-02 23:58:37.601 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:58:37 compute-0 nova_compute[187243]: 2025-12-02 23:58:37.602 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.232s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:58:38 compute-0 nova_compute[187243]: 2025-12-02 23:58:38.601 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:41 compute-0 nova_compute[187243]: 2025-12-02 23:58:41.157 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:41 compute-0 nova_compute[187243]: 2025-12-02 23:58:41.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:58:42 compute-0 nova_compute[187243]: 2025-12-02 23:58:42.580 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:44 compute-0 sshd-session[212658]: Invalid user janice from 45.78.219.213 port 48984
Dec 02 23:58:44 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:58:44.202 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:58:44 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:58:44.203 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:58:44 compute-0 nova_compute[187243]: 2025-12-02 23:58:44.204 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:44 compute-0 podman[212660]: 2025-12-02 23:58:44.205236291 +0000 UTC m=+0.072979991 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec 02 23:58:45 compute-0 sshd-session[212658]: Received disconnect from 45.78.219.213 port 48984:11: Bye Bye [preauth]
Dec 02 23:58:45 compute-0 sshd-session[212658]: Disconnected from invalid user janice 45.78.219.213 port 48984 [preauth]
Dec 02 23:58:46 compute-0 nova_compute[187243]: 2025-12-02 23:58:46.159 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:47 compute-0 nova_compute[187243]: 2025-12-02 23:58:47.582 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:50 compute-0 podman[212684]: 2025-12-02 23:58:50.17392181 +0000 UTC m=+0.119927292 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Dec 02 23:58:51 compute-0 nova_compute[187243]: 2025-12-02 23:58:51.196 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:52 compute-0 nova_compute[187243]: 2025-12-02 23:58:52.584 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:53 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:58:53.204 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:58:56 compute-0 nova_compute[187243]: 2025-12-02 23:58:56.204 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:57 compute-0 nova_compute[187243]: 2025-12-02 23:58:57.586 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:58:59 compute-0 podman[212705]: 2025-12-02 23:58:59.124818568 +0000 UTC m=+0.071081454 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 02 23:58:59 compute-0 sshd-session[212682]: Connection closed by 101.47.140.127 port 44418 [preauth]
Dec 02 23:58:59 compute-0 podman[197600]: time="2025-12-02T23:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:58:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:58:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 02 23:59:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:00.683 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:00.683 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:00 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:00.683 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:01 compute-0 nova_compute[187243]: 2025-12-02 23:59:01.245 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: ERROR   23:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: ERROR   23:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: ERROR   23:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: ERROR   23:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: ERROR   23:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:59:01 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:59:02 compute-0 nova_compute[187243]: 2025-12-02 23:59:02.587 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:03 compute-0 podman[212731]: 2025-12-02 23:59:03.134639136 +0000 UTC m=+0.084905023 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 02 23:59:03 compute-0 podman[212732]: 2025-12-02 23:59:03.159537267 +0000 UTC m=+0.106531944 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Dec 02 23:59:06 compute-0 nova_compute[187243]: 2025-12-02 23:59:06.245 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:07 compute-0 nova_compute[187243]: 2025-12-02 23:59:07.590 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:11 compute-0 nova_compute[187243]: 2025-12-02 23:59:11.247 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:11 compute-0 nova_compute[187243]: 2025-12-02 23:59:11.441 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:12 compute-0 nova_compute[187243]: 2025-12-02 23:59:12.591 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:15 compute-0 podman[212772]: 2025-12-02 23:59:15.176088659 +0000 UTC m=+0.092798177 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 02 23:59:16 compute-0 sshd-session[212771]: Invalid user dd from 61.220.235.10 port 38318
Dec 02 23:59:16 compute-0 sshd-session[212771]: Received disconnect from 61.220.235.10 port 38318:11: Bye Bye [preauth]
Dec 02 23:59:16 compute-0 sshd-session[212771]: Disconnected from invalid user dd 61.220.235.10 port 38318 [preauth]
Dec 02 23:59:16 compute-0 nova_compute[187243]: 2025-12-02 23:59:16.301 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:17 compute-0 nova_compute[187243]: 2025-12-02 23:59:17.593 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:21 compute-0 podman[212794]: 2025-12-02 23:59:21.132528717 +0000 UTC m=+0.074949159 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 02 23:59:21 compute-0 nova_compute[187243]: 2025-12-02 23:59:21.333 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:22 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:22.239 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:31:60 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86bd5114f990455bad9eb03145bbd520', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1e62127c-f508-4e9e-bb5e-b8835c45c013) old=Port_Binding(mac=['fa:16:3e:b1:31:60'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86bd5114f990455bad9eb03145bbd520', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:22 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:22.241 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1e62127c-f508-4e9e-bb5e-b8835c45c013 in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 updated
Dec 02 23:59:22 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:22.242 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c53a3e7-267c-42d7-8662-f773adcc4604, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:59:22 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:22.243 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b2701522-c26a-4c6b-ac2d-6d8a8852a7c9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:22 compute-0 nova_compute[187243]: 2025-12-02 23:59:22.594 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:24 compute-0 sshd-session[212815]: Invalid user blue from 102.210.148.92 port 50876
Dec 02 23:59:24 compute-0 sshd-session[212815]: Received disconnect from 102.210.148.92 port 50876:11: Bye Bye [preauth]
Dec 02 23:59:24 compute-0 sshd-session[212815]: Disconnected from invalid user blue 102.210.148.92 port 50876 [preauth]
Dec 02 23:59:26 compute-0 nova_compute[187243]: 2025-12-02 23:59:26.370 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:27 compute-0 nova_compute[187243]: 2025-12-02 23:59:27.617 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:29 compute-0 nova_compute[187243]: 2025-12-02 23:59:29.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:29 compute-0 podman[197600]: time="2025-12-02T23:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:59:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 02 23:59:29 compute-0 podman[197600]: @ - - [02/Dec/2025:23:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 02 23:59:30 compute-0 podman[212817]: 2025-12-02 23:59:30.108456765 +0000 UTC m=+0.059463642 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 02 23:59:30 compute-0 nova_compute[187243]: 2025-12-02 23:59:30.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:30 compute-0 nova_compute[187243]: 2025-12-02 23:59:30.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 02 23:59:31 compute-0 nova_compute[187243]: 2025-12-02 23:59:31.373 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: ERROR   23:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: ERROR   23:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: ERROR   23:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: ERROR   23:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: ERROR   23:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 02 23:59:31 compute-0 openstack_network_exporter[199746]: 
Dec 02 23:59:31 compute-0 nova_compute[187243]: 2025-12-02 23:59:31.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:32 compute-0 nova_compute[187243]: 2025-12-02 23:59:32.663 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:33 compute-0 nova_compute[187243]: 2025-12-02 23:59:33.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:34 compute-0 podman[212841]: 2025-12-02 23:59:34.136707249 +0000 UTC m=+0.086265273 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 02 23:59:34 compute-0 podman[212842]: 2025-12-02 23:59:34.191684366 +0000 UTC m=+0.133990876 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 02 23:59:34 compute-0 nova_compute[187243]: 2025-12-02 23:59:34.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:35 compute-0 nova_compute[187243]: 2025-12-02 23:59:35.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:35 compute-0 nova_compute[187243]: 2025-12-02 23:59:35.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:36 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:36.351 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:b3:30 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5c8896b-cfca-4a55-9039-47650a4a166a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1a892beb-44d7-43a2-b31f-e3508e05fb34) old=Port_Binding(mac=['fa:16:3e:0f:b3:30'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf2d65cb-48b1-4884-a049-c1d6f7a31df9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:36 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:36.352 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1a892beb-44d7-43a2-b31f-e3508e05fb34 in datapath bf2d65cb-48b1-4884-a049-c1d6f7a31df9 updated
Dec 02 23:59:36 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:36.354 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bf2d65cb-48b1-4884-a049-c1d6f7a31df9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 02 23:59:36 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:36.355 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d29a8774-f2a5-490b-852a-154f6f086f9a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:36 compute-0 nova_compute[187243]: 2025-12-02 23:59:36.375 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:36 compute-0 nova_compute[187243]: 2025-12-02 23:59:36.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.110 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.111 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.111 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.112 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.301 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.302 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.322 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.323 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.16595458984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.323 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.323 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:37 compute-0 sshd-session[212888]: Invalid user odin from 23.95.37.90 port 57284
Dec 02 23:59:37 compute-0 nova_compute[187243]: 2025-12-02 23:59:37.664 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:37 compute-0 sshd-session[212888]: Received disconnect from 23.95.37.90 port 57284:11: Bye Bye [preauth]
Dec 02 23:59:37 compute-0 sshd-session[212888]: Disconnected from invalid user odin 23.95.37.90 port 57284 [preauth]
Dec 02 23:59:38 compute-0 nova_compute[187243]: 2025-12-02 23:59:38.377 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 02 23:59:38 compute-0 nova_compute[187243]: 2025-12-02 23:59:38.378 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 23:59:37 up  1:07,  0 user,  load average: 0.26, 0.33, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 02 23:59:38 compute-0 nova_compute[187243]: 2025-12-02 23:59:38.573 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:59:39 compute-0 nova_compute[187243]: 2025-12-02 23:59:39.082 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:59:39 compute-0 nova_compute[187243]: 2025-12-02 23:59:39.591 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 02 23:59:39 compute-0 nova_compute[187243]: 2025-12-02 23:59:39.591 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.268s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:41 compute-0 nova_compute[187243]: 2025-12-02 23:59:41.377 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:42 compute-0 nova_compute[187243]: 2025-12-02 23:59:42.710 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:44 compute-0 ovn_controller[95488]: 2025-12-02T23:59:44Z|00082|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 02 23:59:46 compute-0 podman[212890]: 2025-12-02 23:59:46.119742648 +0000 UTC m=+0.073739175 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 02 23:59:46 compute-0 nova_compute[187243]: 2025-12-02 23:59:46.251 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:46 compute-0 nova_compute[187243]: 2025-12-02 23:59:46.251 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:46 compute-0 nova_compute[187243]: 2025-12-02 23:59:46.379 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:46 compute-0 nova_compute[187243]: 2025-12-02 23:59:46.757 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 02 23:59:47 compute-0 nova_compute[187243]: 2025-12-02 23:59:47.321 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:47 compute-0 nova_compute[187243]: 2025-12-02 23:59:47.322 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:47 compute-0 nova_compute[187243]: 2025-12-02 23:59:47.330 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 02 23:59:47 compute-0 nova_compute[187243]: 2025-12-02 23:59:47.330 187247 INFO nova.compute.claims [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Claim successful on node compute-0.ctlplane.example.com
Dec 02 23:59:47 compute-0 nova_compute[187243]: 2025-12-02 23:59:47.747 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:48 compute-0 nova_compute[187243]: 2025-12-02 23:59:48.393 187247 DEBUG nova.compute.provider_tree [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 02 23:59:48 compute-0 nova_compute[187243]: 2025-12-02 23:59:48.904 187247 DEBUG nova.scheduler.client.report [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 02 23:59:49 compute-0 nova_compute[187243]: 2025-12-02 23:59:49.414 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:49 compute-0 nova_compute[187243]: 2025-12-02 23:59:49.414 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 02 23:59:49 compute-0 nova_compute[187243]: 2025-12-02 23:59:49.928 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 02 23:59:49 compute-0 nova_compute[187243]: 2025-12-02 23:59:49.928 187247 DEBUG nova.network.neutron [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 02 23:59:49 compute-0 nova_compute[187243]: 2025-12-02 23:59:49.928 187247 WARNING neutronclient.v2_0.client [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:49 compute-0 nova_compute[187243]: 2025-12-02 23:59:49.929 187247 WARNING neutronclient.v2_0.client [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:50 compute-0 nova_compute[187243]: 2025-12-02 23:59:50.438 187247 INFO nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 02 23:59:50 compute-0 nova_compute[187243]: 2025-12-02 23:59:50.628 187247 DEBUG nova.network.neutron [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Successfully created port: 010323f8-551b-4929-b180-ea6b100e6d9c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 02 23:59:50 compute-0 nova_compute[187243]: 2025-12-02 23:59:50.945 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 02 23:59:51 compute-0 sshd-session[212911]: Invalid user thomas from 49.247.36.49 port 39855
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.320 187247 DEBUG nova.network.neutron [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Successfully updated port: 010323f8-551b-4929-b180-ea6b100e6d9c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.381 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:51 compute-0 podman[212913]: 2025-12-02 23:59:51.386942927 +0000 UTC m=+0.064667224 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.415 187247 DEBUG nova.compute.manager [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-changed-010323f8-551b-4929-b180-ea6b100e6d9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.416 187247 DEBUG nova.compute.manager [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Refreshing instance network info cache due to event network-changed-010323f8-551b-4929-b180-ea6b100e6d9c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.416 187247 DEBUG oslo_concurrency.lockutils [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-f2790458-4312-4441-8f6b-e679cabd98c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.417 187247 DEBUG oslo_concurrency.lockutils [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-f2790458-4312-4441-8f6b-e679cabd98c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.417 187247 DEBUG nova.network.neutron [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Refreshing network info cache for port 010323f8-551b-4929-b180-ea6b100e6d9c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 02 23:59:51 compute-0 sshd-session[212911]: Received disconnect from 49.247.36.49 port 39855:11: Bye Bye [preauth]
Dec 02 23:59:51 compute-0 sshd-session[212911]: Disconnected from invalid user thomas 49.247.36.49 port 39855 [preauth]
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.828 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "refresh_cache-f2790458-4312-4441-8f6b-e679cabd98c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.924 187247 WARNING neutronclient.v2_0.client [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.961 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.963 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.963 187247 INFO nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Creating image(s)
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.964 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "/var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.964 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "/var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.965 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "/var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.966 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.970 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:59:51 compute-0 nova_compute[187243]: 2025-12-02 23:59:51.972 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.005 187247 DEBUG nova.network.neutron [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.052 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.053 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.053 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.053 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.056 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.056 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.118 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.119 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.157 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.158 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.158 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.167 187247 DEBUG nova.network.neutron [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.206 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.206 187247 DEBUG nova.virt.disk.api [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Checking if we can resize image /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.207 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.253 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.253 187247 DEBUG nova.virt.disk.api [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Cannot resize image /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.254 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.254 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Ensure instance console log exists: /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.254 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.255 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.255 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.677 187247 DEBUG oslo_concurrency.lockutils [req-2b43d13e-f2eb-4dc6-a6c8-32be6ab65d4d req-fceb45bb-6814-4301-b1e4-7001b46ef1b8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-f2790458-4312-4441-8f6b-e679cabd98c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.678 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquired lock "refresh_cache-f2790458-4312-4441-8f6b-e679cabd98c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.678 187247 DEBUG nova.network.neutron [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 02 23:59:52 compute-0 nova_compute[187243]: 2025-12-02 23:59:52.750 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:53 compute-0 nova_compute[187243]: 2025-12-02 23:59:53.304 187247 DEBUG nova.network.neutron [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 02 23:59:53 compute-0 nova_compute[187243]: 2025-12-02 23:59:53.546 187247 WARNING neutronclient.v2_0.client [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:53 compute-0 nova_compute[187243]: 2025-12-02 23:59:53.733 187247 DEBUG nova.network.neutron [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Updating instance_info_cache with network_info: [{"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.241 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Releasing lock "refresh_cache-f2790458-4312-4441-8f6b-e679cabd98c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.241 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Instance network_info: |[{"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.244 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Start _get_guest_xml network_info=[{"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.248 187247 WARNING nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.249 187247 DEBUG nova.virt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-921027', uuid='f2790458-4312-4441-8f6b-e679cabd98c5'), owner=OwnerMeta(userid='f68e1c374dfc43b8a8431b13bafb13c8', username='tempest-TestExecuteBasicStrategy-436376556-project-admin', projectid='916fb9304c874baa83b72f5956839b66', projectname='tempest-TestExecuteBasicStrategy-436376556'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764719994.2498405) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.255 187247 DEBUG nova.virt.libvirt.host [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.255 187247 DEBUG nova.virt.libvirt.host [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.259 187247 DEBUG nova.virt.libvirt.host [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.259 187247 DEBUG nova.virt.libvirt.host [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.261 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.261 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.262 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.262 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.262 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.263 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.263 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.263 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.264 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.264 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.264 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.264 187247 DEBUG nova.virt.hardware [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.269 187247 DEBUG nova.virt.libvirt.vif [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:59:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-921027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-921027',id=10,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yb3cgrkd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-proje
ct-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:59:50Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=f2790458-4312-4441-8f6b-e679cabd98c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.269 187247 DEBUG nova.network.os_vif_util [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converting VIF {"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.270 187247 DEBUG nova.network.os_vif_util [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.271 187247 DEBUG nova.objects.instance [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2790458-4312-4441-8f6b-e679cabd98c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.779 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] End _get_guest_xml xml=<domain type="kvm">
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <uuid>f2790458-4312-4441-8f6b-e679cabd98c5</uuid>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <name>instance-0000000a</name>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <metadata>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteBasicStrategy-server-921027</nova:name>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-02 23:59:54</nova:creationTime>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 02 23:59:54 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:properties>
Dec 02 23:59:54 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         </nova:properties>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       </nova:image>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:owner>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:user uuid="f68e1c374dfc43b8a8431b13bafb13c8">tempest-TestExecuteBasicStrategy-436376556-project-admin</nova:user>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:project uuid="916fb9304c874baa83b72f5956839b66">tempest-TestExecuteBasicStrategy-436376556</nova:project>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       </nova:owner>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <nova:ports>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         <nova:port uuid="010323f8-551b-4929-b180-ea6b100e6d9c">
Dec 02 23:59:54 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:         </nova:port>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       </nova:ports>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </nova:instance>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </metadata>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <system>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <entry name="serial">f2790458-4312-4441-8f6b-e679cabd98c5</entry>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <entry name="uuid">f2790458-4312-4441-8f6b-e679cabd98c5</entry>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </system>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </sysinfo>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <os>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </os>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <features>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <acpi/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <apic/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </features>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </clock>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </cpu>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   <devices>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.config"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </disk>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:26:03:cc"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <target dev="tap010323f8-55"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </interface>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/console.log" append="off"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </serial>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <video>
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </video>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </rng>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 02 23:59:54 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 02 23:59:54 compute-0 nova_compute[187243]:     </memballoon>
Dec 02 23:59:54 compute-0 nova_compute[187243]:   </devices>
Dec 02 23:59:54 compute-0 nova_compute[187243]: </domain>
Dec 02 23:59:54 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.781 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Preparing to wait for external event network-vif-plugged-010323f8-551b-4929-b180-ea6b100e6d9c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.781 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.781 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.782 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.782 187247 DEBUG nova.virt.libvirt.vif [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-02T23:59:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-921027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-921027',id=10,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yb3cgrkd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-02T23:59:50Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=f2790458-4312-4441-8f6b-e679cabd98c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.783 187247 DEBUG nova.network.os_vif_util [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converting VIF {"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.783 187247 DEBUG nova.network.os_vif_util [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.784 187247 DEBUG os_vif [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.784 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.785 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.785 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.786 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.786 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6a930b60-d7ff-5b70-bfd3-83b18ee91d94', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.787 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.789 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.791 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.792 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap010323f8-55, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.792 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap010323f8-55, col_values=(('qos', UUID('9bf59761-7f43-47d1-885d-601b1019de50')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.792 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap010323f8-55, col_values=(('external_ids', {'iface-id': '010323f8-551b-4929-b180-ea6b100e6d9c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:03:cc', 'vm-uuid': 'f2790458-4312-4441-8f6b-e679cabd98c5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.793 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 NetworkManager[55671]: <info>  [1764719994.7951] manager: (tap010323f8-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.795 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.802 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:54 compute-0 nova_compute[187243]: 2025-12-02 23:59:54.802 187247 INFO os_vif [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55')
Dec 02 23:59:56 compute-0 sshd-session[212950]: Invalid user david from 20.123.120.169 port 37740
Dec 02 23:59:56 compute-0 sshd-session[212950]: Received disconnect from 20.123.120.169 port 37740:11: Bye Bye [preauth]
Dec 02 23:59:56 compute-0 sshd-session[212950]: Disconnected from invalid user david 20.123.120.169 port 37740 [preauth]
Dec 02 23:59:56 compute-0 nova_compute[187243]: 2025-12-02 23:59:56.353 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:59:56 compute-0 nova_compute[187243]: 2025-12-02 23:59:56.354 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 02 23:59:56 compute-0 nova_compute[187243]: 2025-12-02 23:59:56.354 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] No VIF found with MAC fa:16:3e:26:03:cc, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 02 23:59:56 compute-0 nova_compute[187243]: 2025-12-02 23:59:56.355 187247 INFO nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Using config drive
Dec 02 23:59:56 compute-0 nova_compute[187243]: 2025-12-02 23:59:56.383 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:56 compute-0 nova_compute[187243]: 2025-12-02 23:59:56.869 187247 WARNING neutronclient.v2_0.client [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.372 187247 INFO nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Creating config drive at /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.config
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.382 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpzxz32btz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.515 187247 DEBUG oslo_concurrency.processutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpzxz32btz" returned: 0 in 0.133s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 02 23:59:57 compute-0 kernel: tap010323f8-55: entered promiscuous mode
Dec 02 23:59:57 compute-0 NetworkManager[55671]: <info>  [1764719997.5808] manager: (tap010323f8-55): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Dec 02 23:59:57 compute-0 ovn_controller[95488]: 2025-12-02T23:59:57Z|00083|binding|INFO|Claiming lport 010323f8-551b-4929-b180-ea6b100e6d9c for this chassis.
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.587 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 ovn_controller[95488]: 2025-12-02T23:59:57Z|00084|binding|INFO|010323f8-551b-4929-b180-ea6b100e6d9c: Claiming fa:16:3e:26:03:cc 10.100.0.12
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.596 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.610 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:03:cc 10.100.0.12'], port_security=['fa:16:3e:26:03:cc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2790458-4312-4441-8f6b-e679cabd98c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1093c49e-a0ca-44ab-a8bd-3c19ec9553c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=010323f8-551b-4929-b180-ea6b100e6d9c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:57 compute-0 systemd-udevd[212969]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.611 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 010323f8-551b-4929-b180-ea6b100e6d9c in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 bound to our chassis
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.613 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 02 23:59:57 compute-0 systemd-machined[153518]: New machine qemu-6-instance-0000000a.
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.630 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4d464d-c457-4731-b8c8-fb75aec57072]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.631 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1c53a3e7-21 in ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 02 23:59:57 compute-0 NetworkManager[55671]: <info>  [1764719997.6333] device (tap010323f8-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 02 23:59:57 compute-0 NetworkManager[55671]: <info>  [1764719997.6345] device (tap010323f8-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.634 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1c53a3e7-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.634 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9dd5a6-1e09-44e5-9241-4e791d3b7a14]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.635 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[df604348-9416-47fc-a806-f58a09b2b8b3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000a.
Dec 02 23:59:57 compute-0 ovn_controller[95488]: 2025-12-02T23:59:57Z|00085|binding|INFO|Setting lport 010323f8-551b-4929-b180-ea6b100e6d9c ovn-installed in OVS
Dec 02 23:59:57 compute-0 ovn_controller[95488]: 2025-12-02T23:59:57Z|00086|binding|INFO|Setting lport 010323f8-551b-4929-b180-ea6b100e6d9c up in Southbound
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.654 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.654 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[93974c3b-fc79-40a8-8cbf-a2004e9c6b72]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.671 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a09d44-62b8-4252-a96e-31931efaa3ab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.716 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[3380116e-8aab-4145-ad94-76595330a015]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.720 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0e65326e-01d5-48d0-8909-5663796ee981]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 systemd-udevd[212973]: Network interface NamePolicy= disabled on kernel command line.
Dec 02 23:59:57 compute-0 NetworkManager[55671]: <info>  [1764719997.7230] manager: (tap1c53a3e7-20): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.756 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[6d76adeb-3c22-42c0-ab13-1c02bbb40b95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.760 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ab82dad4-d347-4984-a9d6-82c3a2f5123d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 NetworkManager[55671]: <info>  [1764719997.7951] device (tap1c53a3e7-20): carrier: link connected
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.804 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[fb112701-2991-4a81-89eb-2125edf16078]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.827 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8762a710-005e-428c-ab6c-d6924f7b839f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c53a3e7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:31:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408843, 'reachable_time': 36523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213003, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.840 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ee7baf-4fc0-4b45-8e92-d273a42348b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:3160'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408843, 'tstamp': 408843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213004, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.863 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[81e6e39d-5544-4771-a7ff-0f9b0a354035]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c53a3e7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:31:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408843, 'reachable_time': 36523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213005, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.896 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6df5a3-40a8-475f-a4ef-964b413b6ddd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.952 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cb82f717-f05e-491b-9fdf-6d70a359c6f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.953 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c53a3e7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.953 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.953 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c53a3e7-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.955 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 NetworkManager[55671]: <info>  [1764719997.9558] manager: (tap1c53a3e7-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 02 23:59:57 compute-0 kernel: tap1c53a3e7-20: entered promiscuous mode
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.957 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.958 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1c53a3e7-20, col_values=(('external_ids', {'iface-id': '1e62127c-f508-4e9e-bb5e-b8835c45c013'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.959 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 ovn_controller[95488]: 2025-12-02T23:59:57Z|00087|binding|INFO|Releasing lport 1e62127c-f508-4e9e-bb5e-b8835c45c013 from this chassis (sb_readonly=0)
Dec 02 23:59:57 compute-0 nova_compute[187243]: 2025-12-02 23:59:57.976 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.977 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[23a66b63-5936-4d51-a10a-298bd75240de]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.978 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.978 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.978 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1c53a3e7-267c-42d7-8662-f773adcc4604 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.978 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.978 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[67555c0a-6f02-4989-be2a-ec562c6608b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.979 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.979 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e86001c0-9fac-4f2d-97bc-87ca598282f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.980 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: global
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: defaults
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     log global
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID 1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 02 23:59:57 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:57.980 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'env', 'PROCESS_TAG=haproxy-1c53a3e7-267c-42d7-8662-f773adcc4604', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1c53a3e7-267c-42d7-8662-f773adcc4604.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.106 187247 DEBUG nova.compute.manager [req-276faa10-539a-4be7-8682-558d548a74ea req-baab864b-9c73-4b26-a404-d202e1368bda 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-plugged-010323f8-551b-4929-b180-ea6b100e6d9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.107 187247 DEBUG oslo_concurrency.lockutils [req-276faa10-539a-4be7-8682-558d548a74ea req-baab864b-9c73-4b26-a404-d202e1368bda 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.107 187247 DEBUG oslo_concurrency.lockutils [req-276faa10-539a-4be7-8682-558d548a74ea req-baab864b-9c73-4b26-a404-d202e1368bda 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.107 187247 DEBUG oslo_concurrency.lockutils [req-276faa10-539a-4be7-8682-558d548a74ea req-baab864b-9c73-4b26-a404-d202e1368bda 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.108 187247 DEBUG nova.compute.manager [req-276faa10-539a-4be7-8682-558d548a74ea req-baab864b-9c73-4b26-a404-d202e1368bda 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Processing event network-vif-plugged-010323f8-551b-4929-b180-ea6b100e6d9c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 02 23:59:58 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:58.174 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.175 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.295 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.298 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.300 187247 INFO nova.virt.libvirt.driver [-] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Instance spawned successfully.
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.301 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 02 23:59:58 compute-0 podman[213044]: 2025-12-02 23:59:58.341994627 +0000 UTC m=+0.045251151 container create ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 02 23:59:58 compute-0 systemd[1]: Started libpod-conmon-ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b.scope.
Dec 02 23:59:58 compute-0 systemd[1]: Started libcrun container.
Dec 02 23:59:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea5e40e13375da801160b3b6f5430466727f2d43404c81353992bf80943af951/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 02 23:59:58 compute-0 podman[213044]: 2025-12-02 23:59:58.319277699 +0000 UTC m=+0.022534243 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 02 23:59:58 compute-0 podman[213044]: 2025-12-02 23:59:58.421807945 +0000 UTC m=+0.125064529 container init ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Dec 02 23:59:58 compute-0 podman[213044]: 2025-12-02 23:59:58.427753586 +0000 UTC m=+0.131010110 container start ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 02 23:59:58 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [NOTICE]   (213064) : New worker (213066) forked
Dec 02 23:59:58 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [NOTICE]   (213064) : Loading success.
Dec 02 23:59:58 compute-0 ovn_metadata_agent[104374]: 2025-12-02 23:59:58.487 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.812 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.812 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.813 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.813 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.813 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:59:58 compute-0 nova_compute[187243]: 2025-12-02 23:59:58.814 187247 DEBUG nova.virt.libvirt.driver [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 02 23:59:59 compute-0 nova_compute[187243]: 2025-12-02 23:59:59.323 187247 INFO nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Took 7.36 seconds to spawn the instance on the hypervisor.
Dec 02 23:59:59 compute-0 nova_compute[187243]: 2025-12-02 23:59:59.325 187247 DEBUG nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 02 23:59:59 compute-0 podman[197600]: time="2025-12-02T23:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 02 23:59:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 02 23:59:59 compute-0 podman[197600]: @ - - [02/Dec/2025:23:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Dec 02 23:59:59 compute-0 nova_compute[187243]: 2025-12-02 23:59:59.837 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 02 23:59:59 compute-0 nova_compute[187243]: 2025-12-02 23:59:59.864 187247 INFO nova.compute.manager [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Took 12.59 seconds to build instance.
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.299 187247 DEBUG nova.compute.manager [req-be17e066-d6e3-490c-9096-09f193e3dce6 req-bdd808cb-b356-482c-a978-cd9d4682ab4c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-plugged-010323f8-551b-4929-b180-ea6b100e6d9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.299 187247 DEBUG oslo_concurrency.lockutils [req-be17e066-d6e3-490c-9096-09f193e3dce6 req-bdd808cb-b356-482c-a978-cd9d4682ab4c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.299 187247 DEBUG oslo_concurrency.lockutils [req-be17e066-d6e3-490c-9096-09f193e3dce6 req-bdd808cb-b356-482c-a978-cd9d4682ab4c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.299 187247 DEBUG oslo_concurrency.lockutils [req-be17e066-d6e3-490c-9096-09f193e3dce6 req-bdd808cb-b356-482c-a978-cd9d4682ab4c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.299 187247 DEBUG nova.compute.manager [req-be17e066-d6e3-490c-9096-09f193e3dce6 req-bdd808cb-b356-482c-a978-cd9d4682ab4c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] No waiting events found dispatching network-vif-plugged-010323f8-551b-4929-b180-ea6b100e6d9c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.300 187247 WARNING nova.compute.manager [req-be17e066-d6e3-490c-9096-09f193e3dce6 req-bdd808cb-b356-482c-a978-cd9d4682ab4c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received unexpected event network-vif-plugged-010323f8-551b-4929-b180-ea6b100e6d9c for instance with vm_state active and task_state None.
Dec 03 00:00:00 compute-0 nova_compute[187243]: 2025-12-03 00:00:00.369 187247 DEBUG oslo_concurrency.lockutils [None req-049f9b63-d983-4e1f-825a-6dba52f11ba4 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:00:00.684 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:00:00.685 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:00:00.685 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:01 compute-0 systemd[1]: Starting update of the root trust anchor for DNSSEC validation in unbound...
Dec 03 00:00:01 compute-0 systemd[1]: Starting Rotate log files...
Dec 03 00:00:01 compute-0 systemd[1]: unbound-anchor.service: Deactivated successfully.
Dec 03 00:00:01 compute-0 systemd[1]: Finished update of the root trust anchor for DNSSEC validation in unbound.
Dec 03 00:00:01 compute-0 podman[213077]: 2025-12-03 00:00:01.102138706 +0000 UTC m=+0.056994639 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:00:01 compute-0 systemd[1]: logrotate.service: Deactivated successfully.
Dec 03 00:00:01 compute-0 systemd[1]: Finished Rotate log files.
Dec 03 00:00:01 compute-0 nova_compute[187243]: 2025-12-03 00:00:01.388 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: ERROR   00:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: ERROR   00:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: ERROR   00:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: ERROR   00:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: ERROR   00:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:00:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:00:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:00:04.488 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:00:04 compute-0 nova_compute[187243]: 2025-12-03 00:00:04.877 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:05 compute-0 podman[213106]: 2025-12-03 00:00:05.132380971 +0000 UTC m=+0.078363492 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 03 00:00:05 compute-0 podman[213107]: 2025-12-03 00:00:05.189043631 +0000 UTC m=+0.129839500 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 03 00:00:06 compute-0 nova_compute[187243]: 2025-12-03 00:00:06.389 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:09 compute-0 nova_compute[187243]: 2025-12-03 00:00:09.880 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:10 compute-0 ovn_controller[95488]: 2025-12-03T00:00:10Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:03:cc 10.100.0.12
Dec 03 00:00:10 compute-0 ovn_controller[95488]: 2025-12-03T00:00:10Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:03:cc 10.100.0.12
Dec 03 00:00:11 compute-0 nova_compute[187243]: 2025-12-03 00:00:11.431 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:14 compute-0 nova_compute[187243]: 2025-12-03 00:00:14.884 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:16 compute-0 nova_compute[187243]: 2025-12-03 00:00:16.433 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:17 compute-0 podman[213158]: 2025-12-03 00:00:17.107187292 +0000 UTC m=+0.060808236 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:00:19 compute-0 nova_compute[187243]: 2025-12-03 00:00:19.886 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:21 compute-0 nova_compute[187243]: 2025-12-03 00:00:21.477 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:22 compute-0 podman[213179]: 2025-12-03 00:00:22.149524436 +0000 UTC m=+0.091901217 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:00:24 compute-0 nova_compute[187243]: 2025-12-03 00:00:24.935 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:26 compute-0 nova_compute[187243]: 2025-12-03 00:00:26.481 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:29 compute-0 podman[197600]: time="2025-12-03T00:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:00:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:00:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Dec 03 00:00:29 compute-0 nova_compute[187243]: 2025-12-03 00:00:29.936 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: ERROR   00:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: ERROR   00:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: ERROR   00:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: ERROR   00:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: ERROR   00:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:00:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:00:31 compute-0 nova_compute[187243]: 2025-12-03 00:00:31.482 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:32 compute-0 podman[213201]: 2025-12-03 00:00:32.093337374 +0000 UTC m=+0.055053090 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:00:32 compute-0 nova_compute[187243]: 2025-12-03 00:00:32.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:32 compute-0 nova_compute[187243]: 2025-12-03 00:00:32.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:32 compute-0 nova_compute[187243]: 2025-12-03 00:00:32.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:32 compute-0 nova_compute[187243]: 2025-12-03 00:00:32.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:00:34 compute-0 nova_compute[187243]: 2025-12-03 00:00:34.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:34 compute-0 nova_compute[187243]: 2025-12-03 00:00:34.938 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:35 compute-0 nova_compute[187243]: 2025-12-03 00:00:35.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:35 compute-0 nova_compute[187243]: 2025-12-03 00:00:35.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:36 compute-0 podman[213227]: 2025-12-03 00:00:36.100224485 +0000 UTC m=+0.054786203 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:00:36 compute-0 podman[213228]: 2025-12-03 00:00:36.158375292 +0000 UTC m=+0.101285814 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:00:36 compute-0 sshd-session[213225]: Invalid user system from 102.210.148.92 port 54742
Dec 03 00:00:36 compute-0 nova_compute[187243]: 2025-12-03 00:00:36.484 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:36 compute-0 nova_compute[187243]: 2025-12-03 00:00:36.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:36 compute-0 sshd-session[213225]: Received disconnect from 102.210.148.92 port 54742:11: Bye Bye [preauth]
Dec 03 00:00:36 compute-0 sshd-session[213225]: Disconnected from invalid user system 102.210.148.92 port 54742 [preauth]
Dec 03 00:00:37 compute-0 nova_compute[187243]: 2025-12-03 00:00:37.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:37 compute-0 nova_compute[187243]: 2025-12-03 00:00:37.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:37 compute-0 nova_compute[187243]: 2025-12-03 00:00:37.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:37 compute-0 nova_compute[187243]: 2025-12-03 00:00:37.110 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.233 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.304 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.306 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.367 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.500 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.501 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.528 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.529 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5672MB free_disk=73.13734436035156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.529 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:00:38 compute-0 nova_compute[187243]: 2025-12-03 00:00:38.529 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:00:39 compute-0 nova_compute[187243]: 2025-12-03 00:00:39.648 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance f2790458-4312-4441-8f6b-e679cabd98c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:00:39 compute-0 nova_compute[187243]: 2025-12-03 00:00:39.648 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:00:39 compute-0 nova_compute[187243]: 2025-12-03 00:00:39.649 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:00:38 up  1:08,  0 user,  load average: 0.34, 0.34, 0.42\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_916fb9304c874baa83b72f5956839b66': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:00:39 compute-0 nova_compute[187243]: 2025-12-03 00:00:39.697 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:00:39 compute-0 sshd-session[213273]: Invalid user dd from 45.78.219.95 port 57822
Dec 03 00:00:39 compute-0 nova_compute[187243]: 2025-12-03 00:00:39.941 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:40 compute-0 nova_compute[187243]: 2025-12-03 00:00:40.206 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:00:40 compute-0 nova_compute[187243]: 2025-12-03 00:00:40.723 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:00:40 compute-0 nova_compute[187243]: 2025-12-03 00:00:40.723 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.194s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:00:40 compute-0 sshd-session[213273]: Received disconnect from 45.78.219.95 port 57822:11: Bye Bye [preauth]
Dec 03 00:00:40 compute-0 sshd-session[213273]: Disconnected from invalid user dd 45.78.219.95 port 57822 [preauth]
Dec 03 00:00:41 compute-0 nova_compute[187243]: 2025-12-03 00:00:41.518 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:41 compute-0 nova_compute[187243]: 2025-12-03 00:00:41.723 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:42 compute-0 nova_compute[187243]: 2025-12-03 00:00:42.234 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:00:44 compute-0 sshd-session[213282]: Invalid user system from 61.220.235.10 port 37472
Dec 03 00:00:44 compute-0 sshd-session[213282]: Received disconnect from 61.220.235.10 port 37472:11: Bye Bye [preauth]
Dec 03 00:00:44 compute-0 sshd-session[213282]: Disconnected from invalid user system 61.220.235.10 port 37472 [preauth]
Dec 03 00:00:44 compute-0 nova_compute[187243]: 2025-12-03 00:00:44.946 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:46 compute-0 nova_compute[187243]: 2025-12-03 00:00:46.519 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:46 compute-0 ovn_controller[95488]: 2025-12-03T00:00:46Z|00088|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Dec 03 00:00:48 compute-0 podman[213284]: 2025-12-03 00:00:48.10320313 +0000 UTC m=+0.060404046 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7)
Dec 03 00:00:49 compute-0 nova_compute[187243]: 2025-12-03 00:00:49.949 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:51 compute-0 nova_compute[187243]: 2025-12-03 00:00:51.092 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Creating tmpfile /var/lib/nova/instances/tmp3pj4_prd to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 03 00:00:51 compute-0 nova_compute[187243]: 2025-12-03 00:00:51.093 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:51 compute-0 nova_compute[187243]: 2025-12-03 00:00:51.198 187247 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 03 00:00:51 compute-0 nova_compute[187243]: 2025-12-03 00:00:51.527 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:53 compute-0 podman[213306]: 2025-12-03 00:00:53.118349174 +0000 UTC m=+0.063834663 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 03 00:00:53 compute-0 nova_compute[187243]: 2025-12-03 00:00:53.242 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:54 compute-0 nova_compute[187243]: 2025-12-03 00:00:54.982 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:56 compute-0 nova_compute[187243]: 2025-12-03 00:00:56.528 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:00:57 compute-0 nova_compute[187243]: 2025-12-03 00:00:57.301 187247 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00869cbc-c7e6-47b4-8d21-c0ac64fe6381',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 03 00:00:58 compute-0 nova_compute[187243]: 2025-12-03 00:00:58.348 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:00:58 compute-0 nova_compute[187243]: 2025-12-03 00:00:58.348 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:00:58 compute-0 nova_compute[187243]: 2025-12-03 00:00:58.348 187247 DEBUG nova.network.neutron [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:00:58 compute-0 nova_compute[187243]: 2025-12-03 00:00:58.854 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:59 compute-0 nova_compute[187243]: 2025-12-03 00:00:59.711 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:00:59 compute-0 podman[197600]: time="2025-12-03T00:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:00:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:00:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec 03 00:00:59 compute-0 nova_compute[187243]: 2025-12-03 00:00:59.951 187247 DEBUG nova.network.neutron [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:00:59 compute-0 nova_compute[187243]: 2025-12-03 00:00:59.986 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:00 compute-0 nova_compute[187243]: 2025-12-03 00:01:00.602 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:01:00 compute-0 nova_compute[187243]: 2025-12-03 00:01:00.621 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00869cbc-c7e6-47b4-8d21-c0ac64fe6381',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 03 00:01:00 compute-0 nova_compute[187243]: 2025-12-03 00:01:00.622 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Creating instance directory: /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 03 00:01:00 compute-0 nova_compute[187243]: 2025-12-03 00:01:00.623 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Creating disk.info with the contents: {'/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk': 'qcow2', '/var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 03 00:01:00 compute-0 nova_compute[187243]: 2025-12-03 00:01:00.623 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 03 00:01:00 compute-0 nova_compute[187243]: 2025-12-03 00:01:00.623 187247 DEBUG nova.objects.instance [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:01:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:00.686 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:00.686 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:00.687 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.129 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.136 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.138 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.210 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.212 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.213 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.214 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.220 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.220 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.291 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.292 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.323 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.324 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.325 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.375 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.375 187247 DEBUG nova.virt.disk.api [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Checking if we can resize image /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.376 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: ERROR   00:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: ERROR   00:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: ERROR   00:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: ERROR   00:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: ERROR   00:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:01:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.495 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk --force-share --output=json" returned: 0 in 0.119s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.496 187247 DEBUG nova.virt.disk.api [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Cannot resize image /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.496 187247 DEBUG nova.objects.instance [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:01:01 compute-0 CROND[213349]: (root) CMD (run-parts /etc/cron.hourly)
Dec 03 00:01:01 compute-0 run-parts[213352]: (/etc/cron.hourly) starting 0anacron
Dec 03 00:01:01 compute-0 nova_compute[187243]: 2025-12-03 00:01:01.530 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:01 compute-0 anacron[213360]: Anacron started on 2025-12-03
Dec 03 00:01:01 compute-0 anacron[213360]: Job `cron.monthly' locked by another anacron - skipping
Dec 03 00:01:01 compute-0 anacron[213360]: Normal exit (0 jobs run)
Dec 03 00:01:01 compute-0 run-parts[213362]: (/etc/cron.hourly) finished 0anacron
Dec 03 00:01:01 compute-0 CROND[213348]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 03 00:01:01 compute-0 sshd-session[213343]: Invalid user tibero from 23.95.37.90 port 57526
Dec 03 00:01:01 compute-0 sshd-session[213343]: Received disconnect from 23.95.37.90 port 57526:11: Bye Bye [preauth]
Dec 03 00:01:01 compute-0 sshd-session[213343]: Disconnected from invalid user tibero 23.95.37.90 port 57526 [preauth]
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.002 187247 DEBUG nova.objects.base [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Object Instance<00869cbc-c7e6-47b4-8d21-c0ac64fe6381> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.003 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.023 187247 DEBUG oslo_concurrency.processutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381/disk.config 497664" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.024 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.026 187247 DEBUG nova.virt.libvirt.vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-03T00:00:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-361127533',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-361127533',id=11,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:00:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yzg3uqar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:00:18Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.026 187247 DEBUG nova.network.os_vif_util [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.028 187247 DEBUG nova.network.os_vif_util [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.028 187247 DEBUG os_vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.029 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.029 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.030 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.031 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.031 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7b988612-1da0-56ef-a77f-2b45e4c189a3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.032 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.035 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.037 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.037 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b2c586f-1a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.037 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4b2c586f-1a, col_values=(('qos', UUID('edf2b69d-bff2-4ace-9df8-1386bd7f7c2b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.038 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4b2c586f-1a, col_values=(('external_ids', {'iface-id': '4b2c586f-1a7f-4c5d-a6a1-90abac987f19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:55:7f', 'vm-uuid': '00869cbc-c7e6-47b4-8d21-c0ac64fe6381'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.038 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-0 NetworkManager[55671]: <info>  [1764720062.0396] manager: (tap4b2c586f-1a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.040 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.044 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.045 187247 INFO os_vif [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a')
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.045 187247 DEBUG nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.045 187247 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00869cbc-c7e6-47b4-8d21-c0ac64fe6381',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.046 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.168 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:02.524 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:02.526 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:01:02 compute-0 nova_compute[187243]: 2025-12-03 00:01:02.592 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:03 compute-0 podman[213371]: 2025-12-03 00:01:03.102360315 +0000 UTC m=+0.057953594 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:01:03 compute-0 nova_compute[187243]: 2025-12-03 00:01:03.582 187247 DEBUG nova.network.neutron [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 03 00:01:03 compute-0 nova_compute[187243]: 2025-12-03 00:01:03.591 187247 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3pj4_prd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00869cbc-c7e6-47b4-8d21-c0ac64fe6381',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 03 00:01:06 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 03 00:01:06 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 03 00:01:06 compute-0 podman[213396]: 2025-12-03 00:01:06.484638464 +0000 UTC m=+0.052251879 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:01:06 compute-0 podman[213397]: 2025-12-03 00:01:06.516711529 +0000 UTC m=+0.081659446 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 03 00:01:06 compute-0 nova_compute[187243]: 2025-12-03 00:01:06.531 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:06 compute-0 kernel: tap4b2c586f-1a: entered promiscuous mode
Dec 03 00:01:06 compute-0 ovn_controller[95488]: 2025-12-03T00:01:06Z|00089|binding|INFO|Claiming lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for this additional chassis.
Dec 03 00:01:06 compute-0 ovn_controller[95488]: 2025-12-03T00:01:06Z|00090|binding|INFO|4b2c586f-1a7f-4c5d-a6a1-90abac987f19: Claiming fa:16:3e:f4:55:7f 10.100.0.14
Dec 03 00:01:06 compute-0 NetworkManager[55671]: <info>  [1764720066.5876] manager: (tap4b2c586f-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Dec 03 00:01:06 compute-0 nova_compute[187243]: 2025-12-03 00:01:06.588 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:06 compute-0 ovn_controller[95488]: 2025-12-03T00:01:06Z|00091|binding|INFO|Setting lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 ovn-installed in OVS
Dec 03 00:01:06 compute-0 nova_compute[187243]: 2025-12-03 00:01:06.602 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.603 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:55:7f 10.100.0.14'], port_security=['fa:16:3e:f4:55:7f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '00869cbc-c7e6-47b4-8d21-c0ac64fe6381', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1093c49e-a0ca-44ab-a8bd-3c19ec9553c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=4b2c586f-1a7f-4c5d-a6a1-90abac987f19) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.603 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 unbound from our chassis
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.606 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 03 00:01:06 compute-0 nova_compute[187243]: 2025-12-03 00:01:06.609 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:06 compute-0 systemd-udevd[213474]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.620 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[05164806-7fab-417a-9a1d-a4a610a323ef]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:06 compute-0 systemd-machined[153518]: New machine qemu-7-instance-0000000b.
Dec 03 00:01:06 compute-0 NetworkManager[55671]: <info>  [1764720066.6334] device (tap4b2c586f-1a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:01:06 compute-0 NetworkManager[55671]: <info>  [1764720066.6343] device (tap4b2c586f-1a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:01:06 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.654 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[830e9636-c59c-43f1-9014-a34931b08942]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.656 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[bfef4c43-f065-4892-8efd-afccf8365d1a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.686 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[3711fe4c-fd1c-43a5-9235-644deda682dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.705 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cae68853-0942-4b75-9834-4b6c749e9bf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c53a3e7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:31:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408843, 'reachable_time': 36523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213488, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.721 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[192138e5-9309-436f-8ff7-2719548f79a4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1c53a3e7-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408856, 'tstamp': 408856}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213489, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1c53a3e7-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408858, 'tstamp': 408858}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213489, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.723 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c53a3e7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:06 compute-0 nova_compute[187243]: 2025-12-03 00:01:06.724 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:06 compute-0 nova_compute[187243]: 2025-12-03 00:01:06.725 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.726 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c53a3e7-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.726 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.726 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1c53a3e7-20, col_values=(('external_ids', {'iface-id': '1e62127c-f508-4e9e-bb5e-b8835c45c013'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.726 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:01:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:06.728 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[89c7d741-073d-4f75-893a-44a7a2cdc0c0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-1c53a3e7-267c-42d7-8662-f773adcc4604\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 1c53a3e7-267c-42d7-8662-f773adcc4604\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:07 compute-0 nova_compute[187243]: 2025-12-03 00:01:07.040 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:08 compute-0 sshd-session[213369]: Connection closed by 45.78.218.154 port 41952 [preauth]
Dec 03 00:01:09 compute-0 ovn_controller[95488]: 2025-12-03T00:01:09Z|00092|binding|INFO|Claiming lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for this chassis.
Dec 03 00:01:09 compute-0 ovn_controller[95488]: 2025-12-03T00:01:09Z|00093|binding|INFO|4b2c586f-1a7f-4c5d-a6a1-90abac987f19: Claiming fa:16:3e:f4:55:7f 10.100.0.14
Dec 03 00:01:09 compute-0 ovn_controller[95488]: 2025-12-03T00:01:09Z|00094|binding|INFO|Setting lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 up in Southbound
Dec 03 00:01:10 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:10.527 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:10 compute-0 nova_compute[187243]: 2025-12-03 00:01:10.895 187247 INFO nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Post operation of migration started
Dec 03 00:01:10 compute-0 nova_compute[187243]: 2025-12-03 00:01:10.895 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.068 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.068 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.243 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.243 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.243 187247 DEBUG nova.network.neutron [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.546 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:11 compute-0 nova_compute[187243]: 2025-12-03 00:01:11.810 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:12 compute-0 nova_compute[187243]: 2025-12-03 00:01:12.042 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:12 compute-0 nova_compute[187243]: 2025-12-03 00:01:12.438 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:12 compute-0 nova_compute[187243]: 2025-12-03 00:01:12.635 187247 DEBUG nova.network.neutron [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [{"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:01:13 compute-0 nova_compute[187243]: 2025-12-03 00:01:13.141 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-00869cbc-c7e6-47b4-8d21-c0ac64fe6381" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:01:13 compute-0 nova_compute[187243]: 2025-12-03 00:01:13.660 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:13 compute-0 nova_compute[187243]: 2025-12-03 00:01:13.660 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:13 compute-0 nova_compute[187243]: 2025-12-03 00:01:13.661 187247 DEBUG oslo_concurrency.lockutils [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:13 compute-0 nova_compute[187243]: 2025-12-03 00:01:13.665 187247 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 03 00:01:13 compute-0 virtqemud[186944]: Domain id=7 name='instance-0000000b' uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381 is tainted: custom-monitor
Dec 03 00:01:14 compute-0 nova_compute[187243]: 2025-12-03 00:01:14.671 187247 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 03 00:01:15 compute-0 nova_compute[187243]: 2025-12-03 00:01:15.677 187247 INFO nova.virt.libvirt.driver [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 03 00:01:15 compute-0 nova_compute[187243]: 2025-12-03 00:01:15.681 187247 DEBUG nova.compute.manager [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:01:16 compute-0 nova_compute[187243]: 2025-12-03 00:01:16.201 187247 DEBUG nova.objects.instance [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 03 00:01:16 compute-0 nova_compute[187243]: 2025-12-03 00:01:16.548 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:17 compute-0 nova_compute[187243]: 2025-12-03 00:01:17.044 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:17 compute-0 nova_compute[187243]: 2025-12-03 00:01:17.225 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:17 compute-0 nova_compute[187243]: 2025-12-03 00:01:17.325 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:17 compute-0 nova_compute[187243]: 2025-12-03 00:01:17.325 187247 WARNING neutronclient.v2_0.client [None req-4a3e433e-8003-4674-b0d0-d31c4de67234 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:19 compute-0 podman[213513]: 2025-12-03 00:01:19.109418279 +0000 UTC m=+0.060547009 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.064 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.064 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.065 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.065 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.065 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.075 187247 INFO nova.compute.manager [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Terminating instance
Dec 03 00:01:20 compute-0 sshd-session[213511]: Invalid user thomas from 45.78.219.213 port 45390
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.599 187247 DEBUG nova.compute.manager [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:01:20 compute-0 kernel: tap4b2c586f-1a (unregistering): left promiscuous mode
Dec 03 00:01:20 compute-0 NetworkManager[55671]: <info>  [1764720080.6331] device (tap4b2c586f-1a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:01:20 compute-0 ovn_controller[95488]: 2025-12-03T00:01:20Z|00095|binding|INFO|Releasing lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 from this chassis (sb_readonly=0)
Dec 03 00:01:20 compute-0 ovn_controller[95488]: 2025-12-03T00:01:20Z|00096|binding|INFO|Setting lport 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 down in Southbound
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.636 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 ovn_controller[95488]: 2025-12-03T00:01:20Z|00097|binding|INFO|Removing iface tap4b2c586f-1a ovn-installed in OVS
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.638 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.650 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:55:7f 10.100.0.14'], port_security=['fa:16:3e:f4:55:7f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '00869cbc-c7e6-47b4-8d21-c0ac64fe6381', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '14', 'neutron:security_group_ids': '1093c49e-a0ca-44ab-a8bd-3c19ec9553c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=4b2c586f-1a7f-4c5d-a6a1-90abac987f19) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.651 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2c586f-1a7f-4c5d-a6a1-90abac987f19 in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 unbound from our chassis
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.652 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1c53a3e7-267c-42d7-8662-f773adcc4604
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.655 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.668 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3a103b88-8953-4d87-89e9-c685b158ebe9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 03 00:01:20 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 2.417s CPU time.
Dec 03 00:01:20 compute-0 systemd-machined[153518]: Machine qemu-7-instance-0000000b terminated.
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.694 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[57e957c7-e6cf-48e7-92cb-ecf7fb78412b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.696 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[74e7f9a3-8587-4ec3-addd-fbb2cd0f828a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 sshd-session[213533]: Invalid user fiscal from 49.247.36.49 port 60418
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.724 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[55711b85-f849-48b5-a134-1173ea807969]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.739 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9411a4-a5cb-47b7-98e6-200c04bad50f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1c53a3e7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:31:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408843, 'reachable_time': 36523, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213548, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.754 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2b7402da-a0c0-4b63-b61b-d3e54dd46006]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1c53a3e7-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408856, 'tstamp': 408856}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213549, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1c53a3e7-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408858, 'tstamp': 408858}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213549, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.755 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c53a3e7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.756 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.760 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.760 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c53a3e7-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.760 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.760 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1c53a3e7-20, col_values=(('external_ids', {'iface-id': '1e62127c-f508-4e9e-bb5e-b8835c45c013'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.761 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:01:20 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:20.762 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bb71e610-5d06-45ae-9bcf-c0d102a4df6d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-1c53a3e7-267c-42d7-8662-f773adcc4604\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 1c53a3e7-267c-42d7-8662-f773adcc4604\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.818 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.824 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.832 187247 DEBUG nova.compute.manager [req-4e29b367-a24c-4bf4-a276-fb89a4347dd2 req-7424022b-976b-4175-a079-571691b77b76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.833 187247 DEBUG oslo_concurrency.lockutils [req-4e29b367-a24c-4bf4-a276-fb89a4347dd2 req-7424022b-976b-4175-a079-571691b77b76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.833 187247 DEBUG oslo_concurrency.lockutils [req-4e29b367-a24c-4bf4-a276-fb89a4347dd2 req-7424022b-976b-4175-a079-571691b77b76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.833 187247 DEBUG oslo_concurrency.lockutils [req-4e29b367-a24c-4bf4-a276-fb89a4347dd2 req-7424022b-976b-4175-a079-571691b77b76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.833 187247 DEBUG nova.compute.manager [req-4e29b367-a24c-4bf4-a276-fb89a4347dd2 req-7424022b-976b-4175-a079-571691b77b76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.833 187247 DEBUG nova.compute.manager [req-4e29b367-a24c-4bf4-a276-fb89a4347dd2 req-7424022b-976b-4175-a079-571691b77b76 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.864 187247 INFO nova.virt.libvirt.driver [-] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Instance destroyed successfully.
Dec 03 00:01:20 compute-0 nova_compute[187243]: 2025-12-03 00:01:20.864 187247 DEBUG nova.objects.instance [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lazy-loading 'resources' on Instance uuid 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:01:20 compute-0 sshd-session[213533]: Received disconnect from 49.247.36.49 port 60418:11: Bye Bye [preauth]
Dec 03 00:01:20 compute-0 sshd-session[213533]: Disconnected from invalid user fiscal 49.247.36.49 port 60418 [preauth]
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.373 187247 DEBUG nova.virt.libvirt.vif [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-12-03T00:00:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-361127533',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-361127533',id=11,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:00:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yzg3uqar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:01:16Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=00869cbc-c7e6-47b4-8d21-c0ac64fe6381,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.373 187247 DEBUG nova.network.os_vif_util [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converting VIF {"id": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "address": "fa:16:3e:f4:55:7f", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2c586f-1a", "ovs_interfaceid": "4b2c586f-1a7f-4c5d-a6a1-90abac987f19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.373 187247 DEBUG nova.network.os_vif_util [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.374 187247 DEBUG os_vif [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.375 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.376 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2c586f-1a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.415 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.416 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.417 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.417 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=edf2b69d-bff2-4ace-9df8-1386bd7f7c2b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.417 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.418 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.420 187247 INFO os_vif [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:55:7f,bridge_name='br-int',has_traffic_filtering=True,id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2c586f-1a')
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.420 187247 INFO nova.virt.libvirt.driver [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Deleting instance files /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381_del
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.421 187247 INFO nova.virt.libvirt.driver [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Deletion of /var/lib/nova/instances/00869cbc-c7e6-47b4-8d21-c0ac64fe6381_del complete
Dec 03 00:01:21 compute-0 nova_compute[187243]: 2025-12-03 00:01:21.549 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.005 187247 INFO nova.compute.manager [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Took 1.41 seconds to destroy the instance on the hypervisor.
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.006 187247 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.006 187247 DEBUG nova.compute.manager [-] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.006 187247 DEBUG nova.network.neutron [-] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.006 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.273 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.580 187247 DEBUG nova.compute.manager [req-e94b257a-7e6d-4d22-aead-62d4dd089349 req-42830977-ac2c-4e55-8116-5b2dbca7c020 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-deleted-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.581 187247 INFO nova.compute.manager [req-e94b257a-7e6d-4d22-aead-62d4dd089349 req-42830977-ac2c-4e55-8116-5b2dbca7c020 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Neutron deleted interface 4b2c586f-1a7f-4c5d-a6a1-90abac987f19; detaching it from the instance and deleting it from the info cache
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.581 187247 DEBUG nova.network.neutron [req-e94b257a-7e6d-4d22-aead-62d4dd089349 req-42830977-ac2c-4e55-8116-5b2dbca7c020 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.889 187247 DEBUG nova.compute.manager [req-7c91c491-cfc9-4c90-add9-35445347dec1 req-a83d0879-a4a2-425b-94e6-8c11d1f07886 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.889 187247 DEBUG oslo_concurrency.lockutils [req-7c91c491-cfc9-4c90-add9-35445347dec1 req-a83d0879-a4a2-425b-94e6-8c11d1f07886 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.889 187247 DEBUG oslo_concurrency.lockutils [req-7c91c491-cfc9-4c90-add9-35445347dec1 req-a83d0879-a4a2-425b-94e6-8c11d1f07886 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.890 187247 DEBUG oslo_concurrency.lockutils [req-7c91c491-cfc9-4c90-add9-35445347dec1 req-a83d0879-a4a2-425b-94e6-8c11d1f07886 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.890 187247 DEBUG nova.compute.manager [req-7c91c491-cfc9-4c90-add9-35445347dec1 req-a83d0879-a4a2-425b-94e6-8c11d1f07886 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] No waiting events found dispatching network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.890 187247 DEBUG nova.compute.manager [req-7c91c491-cfc9-4c90-add9-35445347dec1 req-a83d0879-a4a2-425b-94e6-8c11d1f07886 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Received event network-vif-unplugged-4b2c586f-1a7f-4c5d-a6a1-90abac987f19 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:22 compute-0 nova_compute[187243]: 2025-12-03 00:01:22.997 187247 DEBUG nova.network.neutron [-] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:01:23 compute-0 sshd-session[213511]: Received disconnect from 45.78.219.213 port 45390:11: Bye Bye [preauth]
Dec 03 00:01:23 compute-0 sshd-session[213511]: Disconnected from invalid user thomas 45.78.219.213 port 45390 [preauth]
Dec 03 00:01:23 compute-0 nova_compute[187243]: 2025-12-03 00:01:23.087 187247 DEBUG nova.compute.manager [req-e94b257a-7e6d-4d22-aead-62d4dd089349 req-42830977-ac2c-4e55-8116-5b2dbca7c020 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Detach interface failed, port_id=4b2c586f-1a7f-4c5d-a6a1-90abac987f19, reason: Instance 00869cbc-c7e6-47b4-8d21-c0ac64fe6381 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:01:23 compute-0 nova_compute[187243]: 2025-12-03 00:01:23.504 187247 INFO nova.compute.manager [-] [instance: 00869cbc-c7e6-47b4-8d21-c0ac64fe6381] Took 1.50 seconds to deallocate network for instance.
Dec 03 00:01:24 compute-0 podman[213568]: 2025-12-03 00:01:24.093175236 +0000 UTC m=+0.048913834 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:01:24 compute-0 nova_compute[187243]: 2025-12-03 00:01:24.459 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:24 compute-0 nova_compute[187243]: 2025-12-03 00:01:24.459 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:24 compute-0 nova_compute[187243]: 2025-12-03 00:01:24.465 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:24 compute-0 nova_compute[187243]: 2025-12-03 00:01:24.506 187247 INFO nova.scheduler.client.report [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Deleted allocations for instance 00869cbc-c7e6-47b4-8d21-c0ac64fe6381
Dec 03 00:01:25 compute-0 nova_compute[187243]: 2025-12-03 00:01:25.551 187247 DEBUG oslo_concurrency.lockutils [None req-ee7e9c84-1f7c-49cb-b140-f812e2accc71 f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "00869cbc-c7e6-47b4-8d21-c0ac64fe6381" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.487s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.190 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.191 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.191 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.191 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.192 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.210 187247 INFO nova.compute.manager [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Terminating instance
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.419 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.550 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.728 187247 DEBUG nova.compute.manager [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:01:26 compute-0 kernel: tap010323f8-55 (unregistering): left promiscuous mode
Dec 03 00:01:26 compute-0 NetworkManager[55671]: <info>  [1764720086.7531] device (tap010323f8-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.763 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 ovn_controller[95488]: 2025-12-03T00:01:26Z|00098|binding|INFO|Releasing lport 010323f8-551b-4929-b180-ea6b100e6d9c from this chassis (sb_readonly=0)
Dec 03 00:01:26 compute-0 ovn_controller[95488]: 2025-12-03T00:01:26Z|00099|binding|INFO|Setting lport 010323f8-551b-4929-b180-ea6b100e6d9c down in Southbound
Dec 03 00:01:26 compute-0 ovn_controller[95488]: 2025-12-03T00:01:26Z|00100|binding|INFO|Removing iface tap010323f8-55 ovn-installed in OVS
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.766 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:26.774 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:03:cc 10.100.0.12'], port_security=['fa:16:3e:26:03:cc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f2790458-4312-4441-8f6b-e679cabd98c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c53a3e7-267c-42d7-8662-f773adcc4604', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '916fb9304c874baa83b72f5956839b66', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1093c49e-a0ca-44ab-a8bd-3c19ec9553c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e22f98a-28c1-406a-8582-57ed07fee88b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=010323f8-551b-4929-b180-ea6b100e6d9c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:26.775 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 010323f8-551b-4929-b180-ea6b100e6d9c in datapath 1c53a3e7-267c-42d7-8662-f773adcc4604 unbound from our chassis
Dec 03 00:01:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:26.777 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c53a3e7-267c-42d7-8662-f773adcc4604, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:01:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:26.778 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1293b165-775d-467e-8f34-f7cf4f7f1d6f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:26.778 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 namespace which is not needed anymore
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.781 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 03 00:01:26 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Consumed 15.527s CPU time.
Dec 03 00:01:26 compute-0 systemd-machined[153518]: Machine qemu-6-instance-0000000a terminated.
Dec 03 00:01:26 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [NOTICE]   (213064) : haproxy version is 3.0.5-8e879a5
Dec 03 00:01:26 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [NOTICE]   (213064) : path to executable is /usr/sbin/haproxy
Dec 03 00:01:26 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [WARNING]  (213064) : Exiting Master process...
Dec 03 00:01:26 compute-0 podman[213616]: 2025-12-03 00:01:26.898520324 +0000 UTC m=+0.031579374 container kill ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:01:26 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [ALERT]    (213064) : Current worker (213066) exited with code 143 (Terminated)
Dec 03 00:01:26 compute-0 neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604[213060]: [WARNING]  (213064) : All workers exited. Exiting... (0)
Dec 03 00:01:26 compute-0 systemd[1]: libpod-ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b.scope: Deactivated successfully.
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.903 187247 DEBUG nova.compute.manager [req-2a2ffc75-5e93-40a8-9881-0625f2c21911 req-2a1e7a1b-cf8d-4477-bfe9-a435c4930460 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-unplugged-010323f8-551b-4929-b180-ea6b100e6d9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.903 187247 DEBUG oslo_concurrency.lockutils [req-2a2ffc75-5e93-40a8-9881-0625f2c21911 req-2a1e7a1b-cf8d-4477-bfe9-a435c4930460 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.903 187247 DEBUG oslo_concurrency.lockutils [req-2a2ffc75-5e93-40a8-9881-0625f2c21911 req-2a1e7a1b-cf8d-4477-bfe9-a435c4930460 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.904 187247 DEBUG oslo_concurrency.lockutils [req-2a2ffc75-5e93-40a8-9881-0625f2c21911 req-2a1e7a1b-cf8d-4477-bfe9-a435c4930460 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.904 187247 DEBUG nova.compute.manager [req-2a2ffc75-5e93-40a8-9881-0625f2c21911 req-2a1e7a1b-cf8d-4477-bfe9-a435c4930460 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] No waiting events found dispatching network-vif-unplugged-010323f8-551b-4929-b180-ea6b100e6d9c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.904 187247 DEBUG nova.compute.manager [req-2a2ffc75-5e93-40a8-9881-0625f2c21911 req-2a1e7a1b-cf8d-4477-bfe9-a435c4930460 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-unplugged-010323f8-551b-4929-b180-ea6b100e6d9c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:26 compute-0 podman[213633]: 2025-12-03 00:01:26.93932275 +0000 UTC m=+0.022496512 container died ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.947 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.951 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b-userdata-shm.mount: Deactivated successfully.
Dec 03 00:01:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-ea5e40e13375da801160b3b6f5430466727f2d43404c81353992bf80943af951-merged.mount: Deactivated successfully.
Dec 03 00:01:26 compute-0 podman[213633]: 2025-12-03 00:01:26.988972502 +0000 UTC m=+0.072146264 container cleanup ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.988 187247 INFO nova.virt.libvirt.driver [-] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Instance destroyed successfully.
Dec 03 00:01:26 compute-0 nova_compute[187243]: 2025-12-03 00:01:26.989 187247 DEBUG nova.objects.instance [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lazy-loading 'resources' on Instance uuid f2790458-4312-4441-8f6b-e679cabd98c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:01:26 compute-0 systemd[1]: libpod-conmon-ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b.scope: Deactivated successfully.
Dec 03 00:01:27 compute-0 podman[213634]: 2025-12-03 00:01:27.006876537 +0000 UTC m=+0.084004876 container remove ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.012 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a04eaa03-905c-4550-86ae-0d2604716b6a]: (4, ("Wed Dec  3 12:01:26 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 (ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b)\nad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b\nWed Dec  3 12:01:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 (ad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b)\nad7fc505703ed71bff94bbb613a7a4cef9454eb7e84b02055845469183fe003b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.014 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[db64296f-3642-4f15-8cae-06de14ce5004]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.014 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1c53a3e7-267c-42d7-8662-f773adcc4604.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.014 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[da45c88a-e857-4a31-8494-501eb113c399]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.015 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c53a3e7-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.016 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 kernel: tap1c53a3e7-20: left promiscuous mode
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.030 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.032 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4350511c-f79a-4349-b0bf-f24a1b5654e1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.049 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6cccf103-7182-464d-991d-e09b2e00d799]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.051 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[905aa59c-67d4-46e1-bf67-32d9a2449731]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.065 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aeaaf9ad-1b79-46e3-96a6-5fc99d891a8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408835, 'reachable_time': 21177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213681, 'error': None, 'target': 'ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.069 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1c53a3e7-267c-42d7-8662-f773adcc4604 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:01:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:27.069 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[e38b3938-449b-4511-8e41-93733eda39e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:27 compute-0 systemd[1]: run-netns-ovnmeta\x2d1c53a3e7\x2d267c\x2d42d7\x2d8662\x2df773adcc4604.mount: Deactivated successfully.
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.497 187247 DEBUG nova.virt.libvirt.vif [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-02T23:59:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-921027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-921027',id=10,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-02T23:59:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='916fb9304c874baa83b72f5956839b66',ramdisk_id='',reservation_id='r-yb3cgrkd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_
ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-436376556',owner_user_name='tempest-TestExecuteBasicStrategy-436376556-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-02T23:59:59Z,user_data=None,user_id='f68e1c374dfc43b8a8431b13bafb13c8',uuid=f2790458-4312-4441-8f6b-e679cabd98c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.498 187247 DEBUG nova.network.os_vif_util [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converting VIF {"id": "010323f8-551b-4929-b180-ea6b100e6d9c", "address": "fa:16:3e:26:03:cc", "network": {"id": "1c53a3e7-267c-42d7-8662-f773adcc4604", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1773980857-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86bd5114f990455bad9eb03145bbd520", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap010323f8-55", "ovs_interfaceid": "010323f8-551b-4929-b180-ea6b100e6d9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.499 187247 DEBUG nova.network.os_vif_util [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.499 187247 DEBUG os_vif [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.500 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.500 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap010323f8-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.502 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.504 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.504 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.504 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9bf59761-7f43-47d1-885d-601b1019de50) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.505 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.506 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.508 187247 INFO os_vif [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:03:cc,bridge_name='br-int',has_traffic_filtering=True,id=010323f8-551b-4929-b180-ea6b100e6d9c,network=Network(1c53a3e7-267c-42d7-8662-f773adcc4604),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap010323f8-55')
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.508 187247 INFO nova.virt.libvirt.driver [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Deleting instance files /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5_del
Dec 03 00:01:27 compute-0 nova_compute[187243]: 2025-12-03 00:01:27.509 187247 INFO nova.virt.libvirt.driver [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Deletion of /var/lib/nova/instances/f2790458-4312-4441-8f6b-e679cabd98c5_del complete
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.022 187247 INFO nova.compute.manager [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.022 187247 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.022 187247 DEBUG nova.compute.manager [-] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.022 187247 DEBUG nova.network.neutron [-] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.023 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.277 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.661 187247 DEBUG nova.compute.manager [req-3e7ff986-d5df-42c5-a504-42dd3442bc92 req-f3792264-efe6-48f1-ba61-73fa5cc4edf2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-deleted-010323f8-551b-4929-b180-ea6b100e6d9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.662 187247 INFO nova.compute.manager [req-3e7ff986-d5df-42c5-a504-42dd3442bc92 req-f3792264-efe6-48f1-ba61-73fa5cc4edf2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Neutron deleted interface 010323f8-551b-4929-b180-ea6b100e6d9c; detaching it from the instance and deleting it from the info cache
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.662 187247 DEBUG nova.network.neutron [req-3e7ff986-d5df-42c5-a504-42dd3442bc92 req-f3792264-efe6-48f1-ba61-73fa5cc4edf2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.958 187247 DEBUG nova.compute.manager [req-37099b51-f93d-4abb-82ba-e112661a5095 req-74cc289d-bb31-4560-b458-e5048a4f4d65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-unplugged-010323f8-551b-4929-b180-ea6b100e6d9c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.958 187247 DEBUG oslo_concurrency.lockutils [req-37099b51-f93d-4abb-82ba-e112661a5095 req-74cc289d-bb31-4560-b458-e5048a4f4d65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.958 187247 DEBUG oslo_concurrency.lockutils [req-37099b51-f93d-4abb-82ba-e112661a5095 req-74cc289d-bb31-4560-b458-e5048a4f4d65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.958 187247 DEBUG oslo_concurrency.lockutils [req-37099b51-f93d-4abb-82ba-e112661a5095 req-74cc289d-bb31-4560-b458-e5048a4f4d65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.959 187247 DEBUG nova.compute.manager [req-37099b51-f93d-4abb-82ba-e112661a5095 req-74cc289d-bb31-4560-b458-e5048a4f4d65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] No waiting events found dispatching network-vif-unplugged-010323f8-551b-4929-b180-ea6b100e6d9c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:01:28 compute-0 nova_compute[187243]: 2025-12-03 00:01:28.959 187247 DEBUG nova.compute.manager [req-37099b51-f93d-4abb-82ba-e112661a5095 req-74cc289d-bb31-4560-b458-e5048a4f4d65 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Received event network-vif-unplugged-010323f8-551b-4929-b180-ea6b100e6d9c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:01:29 compute-0 nova_compute[187243]: 2025-12-03 00:01:29.061 187247 DEBUG nova.network.neutron [-] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:01:29 compute-0 nova_compute[187243]: 2025-12-03 00:01:29.176 187247 DEBUG nova.compute.manager [req-3e7ff986-d5df-42c5-a504-42dd3442bc92 req-f3792264-efe6-48f1-ba61-73fa5cc4edf2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Detach interface failed, port_id=010323f8-551b-4929-b180-ea6b100e6d9c, reason: Instance f2790458-4312-4441-8f6b-e679cabd98c5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:01:29 compute-0 nova_compute[187243]: 2025-12-03 00:01:29.568 187247 INFO nova.compute.manager [-] [instance: f2790458-4312-4441-8f6b-e679cabd98c5] Took 1.55 seconds to deallocate network for instance.
Dec 03 00:01:29 compute-0 nova_compute[187243]: 2025-12-03 00:01:29.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:29 compute-0 podman[197600]: time="2025-12-03T00:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:01:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:01:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec 03 00:01:30 compute-0 nova_compute[187243]: 2025-12-03 00:01:30.091 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:30 compute-0 nova_compute[187243]: 2025-12-03 00:01:30.091 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:30 compute-0 nova_compute[187243]: 2025-12-03 00:01:30.157 187247 DEBUG nova.compute.provider_tree [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:01:30 compute-0 nova_compute[187243]: 2025-12-03 00:01:30.666 187247 DEBUG nova.scheduler.client.report [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:01:31 compute-0 nova_compute[187243]: 2025-12-03 00:01:31.188 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.096s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:31 compute-0 nova_compute[187243]: 2025-12-03 00:01:31.223 187247 INFO nova.scheduler.client.report [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Deleted allocations for instance f2790458-4312-4441-8f6b-e679cabd98c5
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: ERROR   00:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: ERROR   00:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: ERROR   00:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: ERROR   00:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: ERROR   00:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:01:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:01:31 compute-0 nova_compute[187243]: 2025-12-03 00:01:31.551 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:31 compute-0 nova_compute[187243]: 2025-12-03 00:01:31.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:31 compute-0 nova_compute[187243]: 2025-12-03 00:01:31.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:01:32 compute-0 nova_compute[187243]: 2025-12-03 00:01:32.257 187247 DEBUG oslo_concurrency.lockutils [None req-2c582661-493b-4288-9103-d7b125f39d7d f68e1c374dfc43b8a8431b13bafb13c8 916fb9304c874baa83b72f5956839b66 - - default default] Lock "f2790458-4312-4441-8f6b-e679cabd98c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:32 compute-0 nova_compute[187243]: 2025-12-03 00:01:32.506 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:32 compute-0 nova_compute[187243]: 2025-12-03 00:01:32.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:34 compute-0 podman[213682]: 2025-12-03 00:01:34.121366118 +0000 UTC m=+0.077533071 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:01:35 compute-0 nova_compute[187243]: 2025-12-03 00:01:35.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:36 compute-0 sshd-session[213709]: Invalid user jenkins from 20.123.120.169 port 42112
Dec 03 00:01:36 compute-0 sshd-session[213709]: Received disconnect from 20.123.120.169 port 42112:11: Bye Bye [preauth]
Dec 03 00:01:36 compute-0 sshd-session[213709]: Disconnected from invalid user jenkins 20.123.120.169 port 42112 [preauth]
Dec 03 00:01:36 compute-0 nova_compute[187243]: 2025-12-03 00:01:36.553 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:36 compute-0 nova_compute[187243]: 2025-12-03 00:01:36.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:36 compute-0 nova_compute[187243]: 2025-12-03 00:01:36.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:37 compute-0 podman[213711]: 2025-12-03 00:01:37.099412915 +0000 UTC m=+0.052603167 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.105 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:01:37 compute-0 podman[213712]: 2025-12-03 00:01:37.127244433 +0000 UTC m=+0.076030783 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.245 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.246 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.262 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.263 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.16487503051758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.263 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.263 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:01:37 compute-0 nova_compute[187243]: 2025-12-03 00:01:37.556 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.308 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.309 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:01:37 up  1:09,  0 user,  load average: 0.12, 0.28, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.324 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.345 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.345 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.359 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.375 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.395 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:01:38 compute-0 nova_compute[187243]: 2025-12-03 00:01:38.901 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:01:39 compute-0 nova_compute[187243]: 2025-12-03 00:01:39.413 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:01:39 compute-0 nova_compute[187243]: 2025-12-03 00:01:39.414 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:01:40 compute-0 nova_compute[187243]: 2025-12-03 00:01:40.414 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:40 compute-0 nova_compute[187243]: 2025-12-03 00:01:40.414 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:01:41 compute-0 nova_compute[187243]: 2025-12-03 00:01:41.600 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:42 compute-0 nova_compute[187243]: 2025-12-03 00:01:42.559 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:46 compute-0 nova_compute[187243]: 2025-12-03 00:01:46.602 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:47 compute-0 nova_compute[187243]: 2025-12-03 00:01:47.611 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:48.041 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:78:55 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f290453794d4fa8afe33607b761758b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0316d60-1ac4-42e4-998e-71514e598331, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a28c2153-d1ad-49de-a81a-3dcc04970435) old=Port_Binding(mac=['fa:16:3e:f4:78:55'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-745be26c-0cf1-4daa-aa35-3c721fbf4717', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f290453794d4fa8afe33607b761758b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:48.042 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a28c2153-d1ad-49de-a81a-3dcc04970435 in datapath 745be26c-0cf1-4daa-aa35-3c721fbf4717 updated
Dec 03 00:01:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:48.043 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 745be26c-0cf1-4daa-aa35-3c721fbf4717, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:01:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:48.044 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[21e97b42-1179-4f4f-a51c-57a71d849d83]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:48 compute-0 sshd-session[213756]: Invalid user sipv from 102.210.148.92 port 37000
Dec 03 00:01:48 compute-0 sshd-session[213756]: Received disconnect from 102.210.148.92 port 37000:11: Bye Bye [preauth]
Dec 03 00:01:48 compute-0 sshd-session[213756]: Disconnected from invalid user sipv 102.210.148.92 port 37000 [preauth]
Dec 03 00:01:50 compute-0 podman[213758]: 2025-12-03 00:01:50.10741139 +0000 UTC m=+0.055704057 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal)
Dec 03 00:01:51 compute-0 nova_compute[187243]: 2025-12-03 00:01:51.604 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:52 compute-0 nova_compute[187243]: 2025-12-03 00:01:52.655 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:55 compute-0 podman[213780]: 2025-12-03 00:01:55.090387824 +0000 UTC m=+0.050482663 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:01:55 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:55.758 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:75:d4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3dee725c2ed74441890102c62cd79f8e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8d40e54-333e-46e7-95bd-2ddce91fa81d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fcfdc77d-119e-456c-8cca-ff658d4ac68e) old=Port_Binding(mac=['fa:16:3e:46:75:d4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33af606c-dc30-4673-b583-9dcb920ad7fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3dee725c2ed74441890102c62cd79f8e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:01:55 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:55.759 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fcfdc77d-119e-456c-8cca-ff658d4ac68e in datapath 33af606c-dc30-4673-b583-9dcb920ad7fd updated
Dec 03 00:01:55 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:55.760 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33af606c-dc30-4673-b583-9dcb920ad7fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:01:55 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:01:55.761 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[090d89e5-3597-4a31-bd4e-b92f2ccda151]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:01:56 compute-0 nova_compute[187243]: 2025-12-03 00:01:56.606 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:57 compute-0 nova_compute[187243]: 2025-12-03 00:01:57.656 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:01:59 compute-0 podman[197600]: time="2025-12-03T00:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:01:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:01:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 03 00:02:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:00.688 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:00.689 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:00.689 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: ERROR   00:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: ERROR   00:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: ERROR   00:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: ERROR   00:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: ERROR   00:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:02:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:02:01 compute-0 nova_compute[187243]: 2025-12-03 00:02:01.656 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:02 compute-0 nova_compute[187243]: 2025-12-03 00:02:02.666 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:05 compute-0 podman[213803]: 2025-12-03 00:02:05.093301231 +0000 UTC m=+0.049256501 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:02:06 compute-0 nova_compute[187243]: 2025-12-03 00:02:06.661 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:07 compute-0 nova_compute[187243]: 2025-12-03 00:02:07.676 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:08 compute-0 podman[213825]: 2025-12-03 00:02:08.092870935 +0000 UTC m=+0.050325456 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 03 00:02:08 compute-0 podman[213826]: 2025-12-03 00:02:08.141817827 +0000 UTC m=+0.096117801 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:02:09 compute-0 sshd-session[213870]: Invalid user username from 61.220.235.10 port 36626
Dec 03 00:02:09 compute-0 sshd-session[213870]: Received disconnect from 61.220.235.10 port 36626:11: Bye Bye [preauth]
Dec 03 00:02:09 compute-0 sshd-session[213870]: Disconnected from invalid user username 61.220.235.10 port 36626 [preauth]
Dec 03 00:02:10 compute-0 ovn_controller[95488]: 2025-12-03T00:02:10Z|00101|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 03 00:02:11 compute-0 nova_compute[187243]: 2025-12-03 00:02:11.699 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:12 compute-0 nova_compute[187243]: 2025-12-03 00:02:12.679 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:14.228 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:02:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:14.228 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:02:14 compute-0 nova_compute[187243]: 2025-12-03 00:02:14.236 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:16 compute-0 nova_compute[187243]: 2025-12-03 00:02:16.719 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:17 compute-0 nova_compute[187243]: 2025-12-03 00:02:17.682 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:21 compute-0 podman[213873]: 2025-12-03 00:02:21.090541932 +0000 UTC m=+0.051263509 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=)
Dec 03 00:02:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:21.229 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:21 compute-0 nova_compute[187243]: 2025-12-03 00:02:21.776 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:22 compute-0 nova_compute[187243]: 2025-12-03 00:02:22.685 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:23 compute-0 sshd-session[213897]: Invalid user test from 23.95.37.90 port 36908
Dec 03 00:02:23 compute-0 sshd-session[213897]: Received disconnect from 23.95.37.90 port 36908:11: Bye Bye [preauth]
Dec 03 00:02:23 compute-0 sshd-session[213897]: Disconnected from invalid user test 23.95.37.90 port 36908 [preauth]
Dec 03 00:02:26 compute-0 podman[213899]: 2025-12-03 00:02:26.089318982 +0000 UTC m=+0.051131216 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:02:26 compute-0 nova_compute[187243]: 2025-12-03 00:02:26.777 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:26.778 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:bb:c6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '722828099f1644218029b73eaf67d6b4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=add6ea4f-8836-4bed-8f1e-39e943ccf4b5) old=Port_Binding(mac=['fa:16:3e:6d:bb:c6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '722828099f1644218029b73eaf67d6b4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:02:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:26.779 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port add6ea4f-8836-4bed-8f1e-39e943ccf4b5 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb updated
Dec 03 00:02:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:26.780 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed11b71b-745b-4f0c-9f09-37d53d166bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:02:26 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:26.781 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9276c1c6-5554-414d-91ab-f8d6f64ef9bf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:02:27 compute-0 nova_compute[187243]: 2025-12-03 00:02:27.725 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:29 compute-0 podman[197600]: time="2025-12-03T00:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:02:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:02:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: ERROR   00:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: ERROR   00:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: ERROR   00:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: ERROR   00:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: ERROR   00:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:02:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:02:31 compute-0 nova_compute[187243]: 2025-12-03 00:02:31.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:31 compute-0 nova_compute[187243]: 2025-12-03 00:02:31.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:31 compute-0 nova_compute[187243]: 2025-12-03 00:02:31.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:02:31 compute-0 nova_compute[187243]: 2025-12-03 00:02:31.779 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:32 compute-0 nova_compute[187243]: 2025-12-03 00:02:32.774 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:34 compute-0 nova_compute[187243]: 2025-12-03 00:02:34.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:34 compute-0 nova_compute[187243]: 2025-12-03 00:02:34.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:34 compute-0 nova_compute[187243]: 2025-12-03 00:02:34.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:02:36 compute-0 nova_compute[187243]: 2025-12-03 00:02:36.098 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:36 compute-0 podman[213919]: 2025-12-03 00:02:36.102892072 +0000 UTC m=+0.057067661 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:02:36 compute-0 nova_compute[187243]: 2025-12-03 00:02:36.780 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:37 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:37.507 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:a1:44 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=60db874a-6799-4e4b-b253-d9de0d5108a2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fcadb6ca-198d-432c-87c8-ff78b1242ce1) old=Port_Binding(mac=['fa:16:3e:a3:a1:44'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25ca5bbc-e54a-44c7-ba31-d417797e7df1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:02:37 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:37.508 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fcadb6ca-198d-432c-87c8-ff78b1242ce1 in datapath 25ca5bbc-e54a-44c7-ba31-d417797e7df1 updated
Dec 03 00:02:37 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:37.509 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25ca5bbc-e54a-44c7-ba31-d417797e7df1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:02:37 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:02:37.510 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8da93041-0de0-4d5e-ac8e-a7c5c0e315d0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:02:37 compute-0 nova_compute[187243]: 2025-12-03 00:02:37.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:37 compute-0 nova_compute[187243]: 2025-12-03 00:02:37.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:37 compute-0 nova_compute[187243]: 2025-12-03 00:02:37.796 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:39 compute-0 podman[213943]: 2025-12-03 00:02:39.085281248 +0000 UTC m=+0.047354453 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.098 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.098 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:39 compute-0 podman[213944]: 2025-12-03 00:02:39.151187803 +0000 UTC m=+0.109925997 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.616 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.616 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.616 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.616 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.733 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.734 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.750 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.751 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.16495132446289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.751 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:39 compute-0 nova_compute[187243]: 2025-12-03 00:02:39.751 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:40 compute-0 nova_compute[187243]: 2025-12-03 00:02:40.789 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:02:40 compute-0 nova_compute[187243]: 2025-12-03 00:02:40.789 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:02:39 up  1:10,  0 user,  load average: 0.04, 0.22, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:02:40 compute-0 nova_compute[187243]: 2025-12-03 00:02:40.804 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:02:41 compute-0 nova_compute[187243]: 2025-12-03 00:02:41.312 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:02:41 compute-0 nova_compute[187243]: 2025-12-03 00:02:41.824 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:41 compute-0 nova_compute[187243]: 2025-12-03 00:02:41.827 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:02:41 compute-0 nova_compute[187243]: 2025-12-03 00:02:41.828 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.077s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:41 compute-0 nova_compute[187243]: 2025-12-03 00:02:41.828 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:41 compute-0 nova_compute[187243]: 2025-12-03 00:02:41.828 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:02:42 compute-0 nova_compute[187243]: 2025-12-03 00:02:42.386 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:02:42 compute-0 nova_compute[187243]: 2025-12-03 00:02:42.797 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:43 compute-0 nova_compute[187243]: 2025-12-03 00:02:43.880 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:45 compute-0 nova_compute[187243]: 2025-12-03 00:02:45.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:02:46 compute-0 nova_compute[187243]: 2025-12-03 00:02:46.826 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:47 compute-0 nova_compute[187243]: 2025-12-03 00:02:47.630 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:47 compute-0 nova_compute[187243]: 2025-12-03 00:02:47.630 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:47 compute-0 nova_compute[187243]: 2025-12-03 00:02:47.853 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:48 compute-0 nova_compute[187243]: 2025-12-03 00:02:48.136 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:02:48 compute-0 nova_compute[187243]: 2025-12-03 00:02:48.686 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:48 compute-0 nova_compute[187243]: 2025-12-03 00:02:48.686 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:48 compute-0 nova_compute[187243]: 2025-12-03 00:02:48.692 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:02:48 compute-0 nova_compute[187243]: 2025-12-03 00:02:48.692 187247 INFO nova.compute.claims [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:02:50 compute-0 nova_compute[187243]: 2025-12-03 00:02:50.558 187247 DEBUG nova.compute.provider_tree [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:02:51 compute-0 nova_compute[187243]: 2025-12-03 00:02:51.074 187247 DEBUG nova.scheduler.client.report [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:02:51 compute-0 nova_compute[187243]: 2025-12-03 00:02:51.585 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.899s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:51 compute-0 nova_compute[187243]: 2025-12-03 00:02:51.586 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:02:51 compute-0 nova_compute[187243]: 2025-12-03 00:02:51.827 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:52 compute-0 nova_compute[187243]: 2025-12-03 00:02:52.099 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:02:52 compute-0 nova_compute[187243]: 2025-12-03 00:02:52.100 187247 DEBUG nova.network.neutron [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:02:52 compute-0 nova_compute[187243]: 2025-12-03 00:02:52.100 187247 WARNING neutronclient.v2_0.client [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:02:52 compute-0 nova_compute[187243]: 2025-12-03 00:02:52.100 187247 WARNING neutronclient.v2_0.client [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:02:52 compute-0 podman[213990]: 2025-12-03 00:02:52.125621329 +0000 UTC m=+0.075159381 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 03 00:02:52 compute-0 nova_compute[187243]: 2025-12-03 00:02:52.610 187247 INFO nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:02:52 compute-0 nova_compute[187243]: 2025-12-03 00:02:52.855 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:53 compute-0 nova_compute[187243]: 2025-12-03 00:02:53.119 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.155 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.157 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.158 187247 INFO nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Creating image(s)
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.159 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.159 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.161 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.162 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.168 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.170 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.258 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.260 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.261 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.262 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.269 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.270 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:54 compute-0 sshd-session[214011]: Invalid user userb from 49.247.36.49 port 52314
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.361 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.362 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.397 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.398 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.399 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.460 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.461 187247 DEBUG nova.virt.disk.api [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Checking if we can resize image /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.461 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:02:54 compute-0 sshd-session[214011]: Received disconnect from 49.247.36.49 port 52314:11: Bye Bye [preauth]
Dec 03 00:02:54 compute-0 sshd-session[214011]: Disconnected from invalid user userb 49.247.36.49 port 52314 [preauth]
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.520 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.521 187247 DEBUG nova.virt.disk.api [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Cannot resize image /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.521 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.522 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Ensure instance console log exists: /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.522 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.523 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.523 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:54 compute-0 nova_compute[187243]: 2025-12-03 00:02:54.768 187247 DEBUG nova.network.neutron [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Successfully created port: 7b8033d7-6209-4ba1-8605-72623902a9a9 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.375 187247 DEBUG nova.network.neutron [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Successfully updated port: 7b8033d7-6209-4ba1-8605-72623902a9a9 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.435 187247 DEBUG nova.compute.manager [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-changed-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.435 187247 DEBUG nova.compute.manager [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Refreshing instance network info cache due to event network-changed-7b8033d7-6209-4ba1-8605-72623902a9a9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.436 187247 DEBUG oslo_concurrency.lockutils [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.436 187247 DEBUG oslo_concurrency.lockutils [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.437 187247 DEBUG nova.network.neutron [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Refreshing network info cache for port 7b8033d7-6209-4ba1-8605-72623902a9a9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.883 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:02:55 compute-0 nova_compute[187243]: 2025-12-03 00:02:55.944 187247 WARNING neutronclient.v2_0.client [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:02:56 compute-0 nova_compute[187243]: 2025-12-03 00:02:56.349 187247 DEBUG nova.network.neutron [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:02:56 compute-0 nova_compute[187243]: 2025-12-03 00:02:56.576 187247 DEBUG nova.network.neutron [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:02:56 compute-0 nova_compute[187243]: 2025-12-03 00:02:56.829 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:57 compute-0 nova_compute[187243]: 2025-12-03 00:02:57.084 187247 DEBUG oslo_concurrency.lockutils [req-48cac364-120c-4e54-98b5-0c760cd82e52 req-243e0c48-b6be-4a31-8ccb-b05feba990e2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:02:57 compute-0 nova_compute[187243]: 2025-12-03 00:02:57.085 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquired lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:02:57 compute-0 nova_compute[187243]: 2025-12-03 00:02:57.085 187247 DEBUG nova.network.neutron [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:02:57 compute-0 podman[214028]: 2025-12-03 00:02:57.0920427 +0000 UTC m=+0.053220837 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=multipathd)
Dec 03 00:02:57 compute-0 nova_compute[187243]: 2025-12-03 00:02:57.857 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:57 compute-0 nova_compute[187243]: 2025-12-03 00:02:57.918 187247 DEBUG nova.network.neutron [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.163 187247 WARNING neutronclient.v2_0.client [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.346 187247 DEBUG nova.network.neutron [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating instance_info_cache with network_info: [{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.858 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Releasing lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.858 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance network_info: |[{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.860 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Start _get_guest_xml network_info=[{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.864 187247 WARNING nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.865 187247 DEBUG nova.virt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543', uuid='1c6c7975-72fd-442a-b75f-0baede84a60b'), owner=OwnerMeta(userid='ab182b4a69794d1fa103fbd3d503df99', username='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin', projectid='85e2f91a92cf4b5a9d626e8418f17322', projectname='tempest-TestExecuteHostMaintenanceStrategy-1767783627'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": 
"7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720178.8655365) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.870 187247 DEBUG nova.virt.libvirt.host [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.871 187247 DEBUG nova.virt.libvirt.host [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.873 187247 DEBUG nova.virt.libvirt.host [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.874 187247 DEBUG nova.virt.libvirt.host [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.875 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.875 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.875 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.875 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.876 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.876 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.876 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.876 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.876 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.876 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.877 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.877 187247 DEBUG nova.virt.hardware [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.880 187247 DEBUG nova.virt.libvirt.vif [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-233597543',id=12,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3he48fro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-Tes
tExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:02:53Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=1c6c7975-72fd-442a-b75f-0baede84a60b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.880 187247 DEBUG nova.network.os_vif_util [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.881 187247 DEBUG nova.network.os_vif_util [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:02:58 compute-0 nova_compute[187243]: 2025-12-03 00:02:58.882 187247 DEBUG nova.objects.instance [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c6c7975-72fd-442a-b75f-0baede84a60b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.392 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <uuid>1c6c7975-72fd-442a-b75f-0baede84a60b</uuid>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <name>instance-0000000c</name>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-233597543</nova:name>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:02:58</nova:creationTime>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:02:59 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:02:59 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         <nova:port uuid="7b8033d7-6209-4ba1-8605-72623902a9a9">
Dec 03 00:02:59 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <system>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <entry name="serial">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <entry name="uuid">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </system>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <os>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </os>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <features>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </features>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:68:79:f2"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <target dev="tap7b8033d7-62"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <video>
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </video>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:02:59 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:02:59 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:02:59 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:02:59 compute-0 nova_compute[187243]: </domain>
Dec 03 00:02:59 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.394 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Preparing to wait for external event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.394 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.395 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.395 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.396 187247 DEBUG nova.virt.libvirt.vif [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-233597543',id=12,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3he48fro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='t
empest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:02:53Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=1c6c7975-72fd-442a-b75f-0baede84a60b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.397 187247 DEBUG nova.network.os_vif_util [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.398 187247 DEBUG nova.network.os_vif_util [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.398 187247 DEBUG os_vif [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.399 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.399 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.400 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.401 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.402 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '100969f5-aa2e-5b48-a96a-4a62dae66314', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.403 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.406 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.409 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.409 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b8033d7-62, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.410 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7b8033d7-62, col_values=(('qos', UUID('ba6e8c6e-bd30-4228-8b86-a14985303c90')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.411 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7b8033d7-62, col_values=(('external_ids', {'iface-id': '7b8033d7-6209-4ba1-8605-72623902a9a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:79:f2', 'vm-uuid': '1c6c7975-72fd-442a-b75f-0baede84a60b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.412 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-0 NetworkManager[55671]: <info>  [1764720179.4135] manager: (tap7b8033d7-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.415 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.420 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:02:59 compute-0 nova_compute[187243]: 2025-12-03 00:02:59.421 187247 INFO os_vif [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62')
Dec 03 00:02:59 compute-0 podman[197600]: time="2025-12-03T00:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:02:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:02:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Dec 03 00:03:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:00.690 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:00.690 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:00.690 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:00 compute-0 nova_compute[187243]: 2025-12-03 00:03:00.967 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:03:00 compute-0 nova_compute[187243]: 2025-12-03 00:03:00.968 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:03:00 compute-0 nova_compute[187243]: 2025-12-03 00:03:00.968 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No VIF found with MAC fa:16:3e:68:79:f2, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:03:00 compute-0 nova_compute[187243]: 2025-12-03 00:03:00.968 187247 INFO nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Using config drive
Dec 03 00:03:01 compute-0 sshd-session[214050]: Invalid user deploy from 102.210.148.92 port 47620
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: ERROR   00:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: ERROR   00:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: ERROR   00:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: ERROR   00:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: ERROR   00:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:03:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.478 187247 WARNING neutronclient.v2_0.client [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:01 compute-0 sshd-session[214050]: Received disconnect from 102.210.148.92 port 47620:11: Bye Bye [preauth]
Dec 03 00:03:01 compute-0 sshd-session[214050]: Disconnected from invalid user deploy 102.210.148.92 port 47620 [preauth]
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.677 187247 INFO nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Creating config drive at /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.681 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpg5ujtt5t execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.805 187247 DEBUG oslo_concurrency.processutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpg5ujtt5t" returned: 0 in 0.123s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.832 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:01 compute-0 kernel: tap7b8033d7-62: entered promiscuous mode
Dec 03 00:03:01 compute-0 NetworkManager[55671]: <info>  [1764720181.8668] manager: (tap7b8033d7-62): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Dec 03 00:03:01 compute-0 ovn_controller[95488]: 2025-12-03T00:03:01Z|00102|binding|INFO|Claiming lport 7b8033d7-6209-4ba1-8605-72623902a9a9 for this chassis.
Dec 03 00:03:01 compute-0 ovn_controller[95488]: 2025-12-03T00:03:01Z|00103|binding|INFO|7b8033d7-6209-4ba1-8605-72623902a9a9: Claiming fa:16:3e:68:79:f2 10.100.0.14
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.868 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.877 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.882 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:79:f2 10.100.0.14'], port_security=['fa:16:3e:68:79:f2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1c6c7975-72fd-442a-b75f-0baede84a60b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=7b8033d7-6209-4ba1-8605-72623902a9a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.883 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 7b8033d7-6209-4ba1-8605-72623902a9a9 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb bound to our chassis
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.884 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.895 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[11973e35-e092-4375-b727-0f9cf3340314]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.896 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped11b71b-71 in ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.899 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped11b71b-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.899 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4fed0ccb-dc99-456c-8cfd-19952a7b150a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.900 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[deca2f80-1893-4744-8a94-ba5b4669fb2b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 systemd-machined[153518]: New machine qemu-8-instance-0000000c.
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.909 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[4253b36b-7c3f-4351-8804-22997dded198]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.945 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc83d32-9d29-4bba-8d79-2949b0b66682]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 ovn_controller[95488]: 2025-12-03T00:03:01Z|00104|binding|INFO|Setting lport 7b8033d7-6209-4ba1-8605-72623902a9a9 ovn-installed in OVS
Dec 03 00:03:01 compute-0 ovn_controller[95488]: 2025-12-03T00:03:01Z|00105|binding|INFO|Setting lport 7b8033d7-6209-4ba1-8605-72623902a9a9 up in Southbound
Dec 03 00:03:01 compute-0 nova_compute[187243]: 2025-12-03 00:03:01.950 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:01 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000c.
Dec 03 00:03:01 compute-0 systemd-udevd[214076]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.975 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[995607f5-bc34-4b50-923d-7eacd3872855]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 NetworkManager[55671]: <info>  [1764720181.9812] device (tap7b8033d7-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:03:01 compute-0 NetworkManager[55671]: <info>  [1764720181.9831] manager: (taped11b71b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Dec 03 00:03:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:01.982 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce43952-9648-4884-a50b-efffd7d4c15c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:01 compute-0 systemd-udevd[214079]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:03:01 compute-0 NetworkManager[55671]: <info>  [1764720181.9835] device (tap7b8033d7-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.008 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[bf373abb-7828-465f-a6a9-b80110065970]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.011 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[4e54b600-5edb-4be5-983d-1d53f0ff22cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 NetworkManager[55671]: <info>  [1764720182.0350] device (taped11b71b-70): carrier: link connected
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.041 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[8f18ddf5-42c2-4639-bca1-19f9c69b3b92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.058 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[661cbcb5-d2a9-4df1-9606-bc92dedd49d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427267, 'reachable_time': 44984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214108, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.075 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f9a28d-9c92-42e3-9784-70471872d6e4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:bbc6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 427267, 'tstamp': 427267}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214109, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.092 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[03e0c202-3c9f-4c54-b10d-82d60e85dd71]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427267, 'reachable_time': 44984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214110, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.123 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[76ff1d99-5ac6-408a-975d-f4dc731fa2e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.185 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b45bfd05-0d64-4c06-9830-92fb09fee797]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.187 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.187 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.188 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.189 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:02 compute-0 kernel: taped11b71b-70: entered promiscuous mode
Dec 03 00:03:02 compute-0 NetworkManager[55671]: <info>  [1764720182.1907] manager: (taped11b71b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.193 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:02 compute-0 ovn_controller[95488]: 2025-12-03T00:03:02Z|00106|binding|INFO|Releasing lport add6ea4f-8836-4bed-8f1e-39e943ccf4b5 from this chassis (sb_readonly=0)
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.194 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.196 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1bfa3622-fdca-49f1-98e5-5f4d18d158a3]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.196 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.196 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.196 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ed11b71b-745b-4f0c-9f09-37d53d166bcb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.197 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.197 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d32e858a-47f6-460c-ba9e-2c5ac7051a79]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.198 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.198 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f30b7a-45e9-4aed-914f-667765a9e504]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.199 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:03:02 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:02.199 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'env', 'PROCESS_TAG=haproxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed11b71b-745b-4f0c-9f09-37d53d166bcb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.207 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.447 187247 DEBUG nova.compute.manager [req-56873bdd-cac4-41fb-9f31-555753269532 req-9e9c49f6-7b27-471c-9ca2-9d091dda79e5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.448 187247 DEBUG oslo_concurrency.lockutils [req-56873bdd-cac4-41fb-9f31-555753269532 req-9e9c49f6-7b27-471c-9ca2-9d091dda79e5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.448 187247 DEBUG oslo_concurrency.lockutils [req-56873bdd-cac4-41fb-9f31-555753269532 req-9e9c49f6-7b27-471c-9ca2-9d091dda79e5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.448 187247 DEBUG oslo_concurrency.lockutils [req-56873bdd-cac4-41fb-9f31-555753269532 req-9e9c49f6-7b27-471c-9ca2-9d091dda79e5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.449 187247 DEBUG nova.compute.manager [req-56873bdd-cac4-41fb-9f31-555753269532 req-9e9c49f6-7b27-471c-9ca2-9d091dda79e5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Processing event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.449 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.453 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.455 187247 INFO nova.virt.libvirt.driver [-] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance spawned successfully.
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.456 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:03:02 compute-0 sshd-session[214106]: Invalid user nominatim from 20.123.120.169 port 57910
Dec 03 00:03:02 compute-0 podman[214149]: 2025-12-03 00:03:02.553845615 +0000 UTC m=+0.048569384 container create cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 03 00:03:02 compute-0 systemd[1]: Started libpod-conmon-cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba.scope.
Dec 03 00:03:02 compute-0 podman[214149]: 2025-12-03 00:03:02.528620741 +0000 UTC m=+0.023344540 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:03:02 compute-0 sshd-session[214106]: Received disconnect from 20.123.120.169 port 57910:11: Bye Bye [preauth]
Dec 03 00:03:02 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:03:02 compute-0 sshd-session[214106]: Disconnected from invalid user nominatim 20.123.120.169 port 57910 [preauth]
Dec 03 00:03:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a462ad89309f76660424566f2c45ae6051283bedf9bdba4f8df56094e8a9564/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:03:02 compute-0 podman[214149]: 2025-12-03 00:03:02.649393652 +0000 UTC m=+0.144117471 container init cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:03:02 compute-0 podman[214149]: 2025-12-03 00:03:02.655739066 +0000 UTC m=+0.150462845 container start cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:03:02 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [NOTICE]   (214168) : New worker (214170) forked
Dec 03 00:03:02 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [NOTICE]   (214168) : Loading success.
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.970 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.971 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.972 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.972 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.973 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:02 compute-0 nova_compute[187243]: 2025-12-03 00:03:02.974 187247 DEBUG nova.virt.libvirt.driver [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:03:03 compute-0 nova_compute[187243]: 2025-12-03 00:03:03.485 187247 INFO nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Took 9.33 seconds to spawn the instance on the hypervisor.
Dec 03 00:03:03 compute-0 nova_compute[187243]: 2025-12-03 00:03:03.486 187247 DEBUG nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.028 187247 INFO nova.compute.manager [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Took 15.38 seconds to build instance.
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.413 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.517 187247 DEBUG nova.compute.manager [req-41f7b06b-748a-4f8d-a786-d886b57d462b req-470369de-8acd-4442-927a-67a653e0ae29 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.518 187247 DEBUG oslo_concurrency.lockutils [req-41f7b06b-748a-4f8d-a786-d886b57d462b req-470369de-8acd-4442-927a-67a653e0ae29 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.518 187247 DEBUG oslo_concurrency.lockutils [req-41f7b06b-748a-4f8d-a786-d886b57d462b req-470369de-8acd-4442-927a-67a653e0ae29 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.518 187247 DEBUG oslo_concurrency.lockutils [req-41f7b06b-748a-4f8d-a786-d886b57d462b req-470369de-8acd-4442-927a-67a653e0ae29 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.518 187247 DEBUG nova.compute.manager [req-41f7b06b-748a-4f8d-a786-d886b57d462b req-470369de-8acd-4442-927a-67a653e0ae29 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.519 187247 WARNING nova.compute.manager [req-41f7b06b-748a-4f8d-a786-d886b57d462b req-470369de-8acd-4442-927a-67a653e0ae29 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received unexpected event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with vm_state active and task_state None.
Dec 03 00:03:04 compute-0 nova_compute[187243]: 2025-12-03 00:03:04.534 187247 DEBUG oslo_concurrency.lockutils [None req-42ac602d-0490-4a80-b19c-83db12b1def4 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.903s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:06 compute-0 nova_compute[187243]: 2025-12-03 00:03:06.834 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:07 compute-0 podman[214179]: 2025-12-03 00:03:07.14467142 +0000 UTC m=+0.094132834 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:03:09 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 00:03:09 compute-0 podman[214205]: 2025-12-03 00:03:09.247448807 +0000 UTC m=+0.048161644 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:03:09 compute-0 podman[214206]: 2025-12-03 00:03:09.31039462 +0000 UTC m=+0.112010759 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 03 00:03:09 compute-0 nova_compute[187243]: 2025-12-03 00:03:09.415 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:11 compute-0 nova_compute[187243]: 2025-12-03 00:03:11.882 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:13 compute-0 sshd[128750]: Timeout before authentication for connection from 101.47.140.127 to 38.102.83.77, pid = 213509
Dec 03 00:03:14 compute-0 nova_compute[187243]: 2025-12-03 00:03:14.417 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:14 compute-0 nova_compute[187243]: 2025-12-03 00:03:14.516 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:14.516 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:03:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:14.517 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:03:14 compute-0 ovn_controller[95488]: 2025-12-03T00:03:14Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:79:f2 10.100.0.14
Dec 03 00:03:14 compute-0 ovn_controller[95488]: 2025-12-03T00:03:14Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:79:f2 10.100.0.14
Dec 03 00:03:16 compute-0 nova_compute[187243]: 2025-12-03 00:03:16.885 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:17.519 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:19 compute-0 nova_compute[187243]: 2025-12-03 00:03:19.420 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:21 compute-0 nova_compute[187243]: 2025-12-03 00:03:21.886 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:23 compute-0 podman[214272]: 2025-12-03 00:03:23.128524199 +0000 UTC m=+0.082502841 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Dec 03 00:03:24 compute-0 nova_compute[187243]: 2025-12-03 00:03:24.424 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:26 compute-0 nova_compute[187243]: 2025-12-03 00:03:26.487 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:26 compute-0 nova_compute[187243]: 2025-12-03 00:03:26.888 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:26 compute-0 nova_compute[187243]: 2025-12-03 00:03:26.995 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Triggering sync for uuid 1c6c7975-72fd-442a-b75f-0baede84a60b _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Dec 03 00:03:26 compute-0 nova_compute[187243]: 2025-12-03 00:03:26.995 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:26 compute-0 nova_compute[187243]: 2025-12-03 00:03:26.995 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:27 compute-0 nova_compute[187243]: 2025-12-03 00:03:27.511 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.515s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:28 compute-0 podman[214296]: 2025-12-03 00:03:28.113421199 +0000 UTC m=+0.065887366 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:03:29 compute-0 sshd-session[214294]: Invalid user syncthing from 45.78.219.95 port 46836
Dec 03 00:03:29 compute-0 nova_compute[187243]: 2025-12-03 00:03:29.426 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:29 compute-0 podman[197600]: time="2025-12-03T00:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:03:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:03:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Dec 03 00:03:29 compute-0 sshd-session[214294]: Received disconnect from 45.78.219.95 port 46836:11: Bye Bye [preauth]
Dec 03 00:03:29 compute-0 sshd-session[214294]: Disconnected from invalid user syncthing 45.78.219.95 port 46836 [preauth]
Dec 03 00:03:30 compute-0 nova_compute[187243]: 2025-12-03 00:03:30.889 187247 DEBUG nova.compute.manager [None req-61a42a30-ebf8-446a-8956-2754dc6c7bf8 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Dec 03 00:03:30 compute-0 nova_compute[187243]: 2025-12-03 00:03:30.955 187247 DEBUG nova.compute.provider_tree [None req-61a42a30-ebf8-446a-8956-2754dc6c7bf8 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 16 to 18 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: ERROR   00:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: ERROR   00:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: ERROR   00:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: ERROR   00:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: ERROR   00:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:03:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:03:31 compute-0 nova_compute[187243]: 2025-12-03 00:03:31.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:31 compute-0 nova_compute[187243]: 2025-12-03 00:03:31.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:31 compute-0 nova_compute[187243]: 2025-12-03 00:03:31.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:03:31 compute-0 nova_compute[187243]: 2025-12-03 00:03:31.892 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:32 compute-0 sshd-session[214316]: Invalid user sales1 from 61.220.235.10 port 35778
Dec 03 00:03:33 compute-0 sshd-session[214316]: Received disconnect from 61.220.235.10 port 35778:11: Bye Bye [preauth]
Dec 03 00:03:33 compute-0 sshd-session[214316]: Disconnected from invalid user sales1 61.220.235.10 port 35778 [preauth]
Dec 03 00:03:34 compute-0 nova_compute[187243]: 2025-12-03 00:03:34.429 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:36 compute-0 sshd[128750]: drop connection #0 from [101.47.140.127]:54062 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 03 00:03:36 compute-0 nova_compute[187243]: 2025-12-03 00:03:36.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:36 compute-0 nova_compute[187243]: 2025-12-03 00:03:36.892 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:37 compute-0 nova_compute[187243]: 2025-12-03 00:03:37.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:37 compute-0 nova_compute[187243]: 2025-12-03 00:03:37.815 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Check if temp file /var/lib/nova/instances/tmpzkdk_x_6 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:03:37 compute-0 nova_compute[187243]: 2025-12-03 00:03:37.819 187247 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6c7975-72fd-442a-b75f-0baede84a60b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:03:38 compute-0 podman[214318]: 2025-12-03 00:03:38.100504693 +0000 UTC m=+0.051373312 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:03:38 compute-0 nova_compute[187243]: 2025-12-03 00:03:38.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:39 compute-0 nova_compute[187243]: 2025-12-03 00:03:39.432 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:40 compute-0 podman[214342]: 2025-12-03 00:03:40.099961242 +0000 UTC m=+0.058584757 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:03:40 compute-0 podman[214343]: 2025-12-03 00:03:40.131791737 +0000 UTC m=+0.083925324 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Dec 03 00:03:40 compute-0 nova_compute[187243]: 2025-12-03 00:03:40.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:40 compute-0 nova_compute[187243]: 2025-12-03 00:03:40.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:40 compute-0 nova_compute[187243]: 2025-12-03 00:03:40.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:03:41 compute-0 nova_compute[187243]: 2025-12-03 00:03:41.112 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:41 compute-0 nova_compute[187243]: 2025-12-03 00:03:41.112 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:41 compute-0 nova_compute[187243]: 2025-12-03 00:03:41.113 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:41 compute-0 nova_compute[187243]: 2025-12-03 00:03:41.113 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:03:41 compute-0 nova_compute[187243]: 2025-12-03 00:03:41.950 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.199 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.262 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.263 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.326 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.454 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.455 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.477 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.479 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5667MB free_disk=73.13624572753906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.479 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.480 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.681 187247 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.743 187247 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.745 187247 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.836 187247 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.838 187247 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Preparing to wait for external event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.839 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.840 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:42 compute-0 nova_compute[187243]: 2025-12-03 00:03:42.840 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:43 compute-0 nova_compute[187243]: 2025-12-03 00:03:43.510 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating resource usage from migration dcc97c02-db8d-4b14-b48e-41025617bbf0
Dec 03 00:03:43 compute-0 nova_compute[187243]: 2025-12-03 00:03:43.541 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration dcc97c02-db8d-4b14-b48e-41025617bbf0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:03:43 compute-0 nova_compute[187243]: 2025-12-03 00:03:43.542 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:03:43 compute-0 nova_compute[187243]: 2025-12-03 00:03:43.542 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:03:42 up  1:11,  0 user,  load average: 0.11, 0.21, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_85e2f91a92cf4b5a9d626e8418f17322': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:03:43 compute-0 nova_compute[187243]: 2025-12-03 00:03:43.694 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:03:44 compute-0 nova_compute[187243]: 2025-12-03 00:03:44.202 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:03:44 compute-0 nova_compute[187243]: 2025-12-03 00:03:44.435 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:44 compute-0 nova_compute[187243]: 2025-12-03 00:03:44.713 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:03:44 compute-0 nova_compute[187243]: 2025-12-03 00:03:44.714 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.234s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:46 compute-0 nova_compute[187243]: 2025-12-03 00:03:46.952 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.043 187247 DEBUG nova.compute.manager [req-d2ec1823-022e-4c14-abce-5fea3cb18701 req-0b24197b-1e35-45db-93d4-853be5625998 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.043 187247 DEBUG oslo_concurrency.lockutils [req-d2ec1823-022e-4c14-abce-5fea3cb18701 req-0b24197b-1e35-45db-93d4-853be5625998 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.043 187247 DEBUG oslo_concurrency.lockutils [req-d2ec1823-022e-4c14-abce-5fea3cb18701 req-0b24197b-1e35-45db-93d4-853be5625998 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.044 187247 DEBUG oslo_concurrency.lockutils [req-d2ec1823-022e-4c14-abce-5fea3cb18701 req-0b24197b-1e35-45db-93d4-853be5625998 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.044 187247 DEBUG nova.compute.manager [req-d2ec1823-022e-4c14-abce-5fea3cb18701 req-0b24197b-1e35-45db-93d4-853be5625998 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No event matching network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 in dict_keys([('network-vif-plugged', '7b8033d7-6209-4ba1-8605-72623902a9a9')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.044 187247 DEBUG nova.compute.manager [req-d2ec1823-022e-4c14-abce-5fea3cb18701 req-0b24197b-1e35-45db-93d4-853be5625998 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:03:49 compute-0 sshd-session[214401]: Invalid user jenkins from 23.95.37.90 port 53192
Dec 03 00:03:49 compute-0 sshd-session[214401]: Received disconnect from 23.95.37.90 port 53192:11: Bye Bye [preauth]
Dec 03 00:03:49 compute-0 sshd-session[214401]: Disconnected from invalid user jenkins 23.95.37.90 port 53192 [preauth]
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.439 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:49 compute-0 nova_compute[187243]: 2025-12-03 00:03:49.869 187247 INFO nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.089 187247 DEBUG nova.compute.manager [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.089 187247 DEBUG oslo_concurrency.lockutils [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.090 187247 DEBUG oslo_concurrency.lockutils [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.090 187247 DEBUG oslo_concurrency.lockutils [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.090 187247 DEBUG nova.compute.manager [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Processing event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.090 187247 DEBUG nova.compute.manager [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-changed-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.091 187247 DEBUG nova.compute.manager [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Refreshing instance network info cache due to event network-changed-7b8033d7-6209-4ba1-8605-72623902a9a9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.091 187247 DEBUG oslo_concurrency.lockutils [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.091 187247 DEBUG oslo_concurrency.lockutils [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.091 187247 DEBUG nova.network.neutron [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Refreshing network info cache for port 7b8033d7-6209-4ba1-8605-72623902a9a9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.093 187247 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:03:51 compute-0 sshd-session[214399]: Invalid user mark from 45.78.219.213 port 36422
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.600 187247 WARNING neutronclient.v2_0.client [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.603 187247 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzkdk_x_6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1c6c7975-72fd-442a-b75f-0baede84a60b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(dcc97c02-db8d-4b14-b48e-41025617bbf0),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:03:51 compute-0 nova_compute[187243]: 2025-12-03 00:03:51.954 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:51 compute-0 ovn_controller[95488]: 2025-12-03T00:03:51Z|00107|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.117 187247 DEBUG nova.objects.instance [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 1c6c7975-72fd-442a-b75f-0baede84a60b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.118 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.119 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.119 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:03:52 compute-0 sshd-session[214399]: Received disconnect from 45.78.219.213 port 36422:11: Bye Bye [preauth]
Dec 03 00:03:52 compute-0 sshd-session[214399]: Disconnected from invalid user mark 45.78.219.213 port 36422 [preauth]
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.468 187247 WARNING neutronclient.v2_0.client [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.622 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.622 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.628 187247 DEBUG nova.virt.libvirt.vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-233597543',id=12,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:03:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3he48fro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:03:03Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=1c6c7975-72fd-442a-b75f-0baede84a60b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.628 187247 DEBUG nova.network.os_vif_util [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.629 187247 DEBUG nova.network.os_vif_util [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.629 187247 DEBUG nova.virt.libvirt.migration [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:68:79:f2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <target dev="tap7b8033d7-62"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]: </interface>
Dec 03 00:03:52 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.630 187247 DEBUG nova.virt.libvirt.migration [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <name>instance-0000000c</name>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <uuid>1c6c7975-72fd-442a-b75f-0baede84a60b</uuid>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-233597543</nova:name>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:02:58</nova:creationTime>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:port uuid="7b8033d7-6209-4ba1-8605-72623902a9a9">
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <system>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="serial">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="uuid">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </system>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <os>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </os>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <features>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </features>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:68:79:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap7b8033d7-62"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </target>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </console>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </input>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <video>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </video>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]: </domain>
Dec 03 00:03:52 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.632 187247 DEBUG nova.virt.libvirt.migration [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <name>instance-0000000c</name>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <uuid>1c6c7975-72fd-442a-b75f-0baede84a60b</uuid>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-233597543</nova:name>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:02:58</nova:creationTime>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:port uuid="7b8033d7-6209-4ba1-8605-72623902a9a9">
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <system>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="serial">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="uuid">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </system>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <os>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </os>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <features>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </features>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:68:79:f2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="tap7b8033d7-62"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </target>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </console>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </input>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <video>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </video>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]: </domain>
Dec 03 00:03:52 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.634 187247 DEBUG nova.virt.libvirt.migration [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <name>instance-0000000c</name>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <uuid>1c6c7975-72fd-442a-b75f-0baede84a60b</uuid>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-233597543</nova:name>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:02:58</nova:creationTime>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <nova:port uuid="7b8033d7-6209-4ba1-8605-72623902a9a9">
Dec 03 00:03:52 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <system>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="serial">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="uuid">1c6c7975-72fd-442a-b75f-0baede84a60b</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </system>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <os>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </os>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <features>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </features>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/disk.config"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:68:79:f2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target dev="tap7b8033d7-62"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:03:52 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       </target>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b/console.log" append="off"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </console>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </input>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <video>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </video>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:03:52 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:03:52 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:03:52 compute-0 nova_compute[187243]: </domain>
Dec 03 00:03:52 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.635 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.671 187247 DEBUG nova.network.neutron [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updated VIF entry in instance network info cache for port 7b8033d7-6209-4ba1-8605-72623902a9a9. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:03:52 compute-0 nova_compute[187243]: 2025-12-03 00:03:52.672 187247 DEBUG nova.network.neutron [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Updating instance_info_cache with network_info: [{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:03:53 compute-0 nova_compute[187243]: 2025-12-03 00:03:53.124 187247 DEBUG nova.virt.libvirt.migration [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:03:53 compute-0 nova_compute[187243]: 2025-12-03 00:03:53.124 187247 INFO nova.virt.libvirt.migration [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:03:53 compute-0 nova_compute[187243]: 2025-12-03 00:03:53.179 187247 DEBUG oslo_concurrency.lockutils [req-8a39b46c-cf85-4c37-a257-ff474ae43b97 req-039ffb6c-bf8e-4047-a68b-4f358d290938 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-1c6c7975-72fd-442a-b75f-0baede84a60b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:03:54 compute-0 podman[214413]: 2025-12-03 00:03:54.111170183 +0000 UTC m=+0.054450116 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.172 187247 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:03:54 compute-0 kernel: tap7b8033d7-62 (unregistering): left promiscuous mode
Dec 03 00:03:54 compute-0 NetworkManager[55671]: <info>  [1764720234.2002] device (tap7b8033d7-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:03:54 compute-0 ovn_controller[95488]: 2025-12-03T00:03:54Z|00108|binding|INFO|Releasing lport 7b8033d7-6209-4ba1-8605-72623902a9a9 from this chassis (sb_readonly=0)
Dec 03 00:03:54 compute-0 ovn_controller[95488]: 2025-12-03T00:03:54Z|00109|binding|INFO|Setting lport 7b8033d7-6209-4ba1-8605-72623902a9a9 down in Southbound
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.203 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 ovn_controller[95488]: 2025-12-03T00:03:54Z|00110|binding|INFO|Removing iface tap7b8033d7-62 ovn-installed in OVS
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.206 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.214 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:79:f2 10.100.0.14'], port_security=['fa:16:3e:68:79:f2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1c6c7975-72fd-442a-b75f-0baede84a60b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=7b8033d7-6209-4ba1-8605-72623902a9a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.215 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 7b8033d7-6209-4ba1-8605-72623902a9a9 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.217 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed11b71b-745b-4f0c-9f09-37d53d166bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.218 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f67939-2643-425f-9d80-37a153240d98]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.219 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace which is not needed anymore
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.222 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 03 00:03:54 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Consumed 13.961s CPU time.
Dec 03 00:03:54 compute-0 systemd-machined[153518]: Machine qemu-8-instance-0000000c terminated.
Dec 03 00:03:54 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [NOTICE]   (214168) : haproxy version is 3.0.5-8e879a5
Dec 03 00:03:54 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [NOTICE]   (214168) : path to executable is /usr/sbin/haproxy
Dec 03 00:03:54 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [WARNING]  (214168) : Exiting Master process...
Dec 03 00:03:54 compute-0 podman[214459]: 2025-12-03 00:03:54.357755684 +0000 UTC m=+0.037953505 container kill cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:03:54 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [ALERT]    (214168) : Current worker (214170) exited with code 143 (Terminated)
Dec 03 00:03:54 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214164]: [WARNING]  (214168) : All workers exited. Exiting... (0)
Dec 03 00:03:54 compute-0 systemd[1]: libpod-cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba.scope: Deactivated successfully.
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.363 187247 DEBUG nova.compute.manager [req-d95c910a-164c-4f5d-9842-dbe24143b1da req-c7f22df5-07cd-4dac-83ec-1a29ce461227 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.363 187247 DEBUG oslo_concurrency.lockutils [req-d95c910a-164c-4f5d-9842-dbe24143b1da req-c7f22df5-07cd-4dac-83ec-1a29ce461227 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.363 187247 DEBUG oslo_concurrency.lockutils [req-d95c910a-164c-4f5d-9842-dbe24143b1da req-c7f22df5-07cd-4dac-83ec-1a29ce461227 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.364 187247 DEBUG oslo_concurrency.lockutils [req-d95c910a-164c-4f5d-9842-dbe24143b1da req-c7f22df5-07cd-4dac-83ec-1a29ce461227 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.364 187247 DEBUG nova.compute.manager [req-d95c910a-164c-4f5d-9842-dbe24143b1da req-c7f22df5-07cd-4dac-83ec-1a29ce461227 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.364 187247 DEBUG nova.compute.manager [req-d95c910a-164c-4f5d-9842-dbe24143b1da req-c7f22df5-07cd-4dac-83ec-1a29ce461227 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:03:54 compute-0 podman[214474]: 2025-12-03 00:03:54.407971866 +0000 UTC m=+0.032868511 container died cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:03:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba-userdata-shm.mount: Deactivated successfully.
Dec 03 00:03:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a462ad89309f76660424566f2c45ae6051283bedf9bdba4f8df56094e8a9564-merged.mount: Deactivated successfully.
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.439 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.439 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.439 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.441 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 podman[214474]: 2025-12-03 00:03:54.445253443 +0000 UTC m=+0.070150068 container cleanup cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 03 00:03:54 compute-0 systemd[1]: libpod-conmon-cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba.scope: Deactivated successfully.
Dec 03 00:03:54 compute-0 podman[214476]: 2025-12-03 00:03:54.460929405 +0000 UTC m=+0.077875736 container remove cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.465 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[08d840bd-cfdd-4d1a-bc98-e7051f1f7cdb]: (4, ("Wed Dec  3 12:03:54 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba)\ncd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba\nWed Dec  3 12:03:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (cd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba)\ncd6e01b0d85e414c178d5cea0083e4a42c1cc2a6fd07529966429ab4b673d3ba\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.467 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5e77cfa5-38d8-4af4-8531-4033b68a836c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.467 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.468 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b753a643-245e-4989-b70a-2393180ab142]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.468 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.470 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 kernel: taped11b71b-70: left promiscuous mode
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.485 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.485 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.487 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[df700f2a-4c56-4818-92c2-303325afe2a7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.511 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4e1d7ad6-a4fa-4577-9c76-01ebd7ae3bd7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.512 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad4647a-5de0-4b40-95f4-4bca16984a89]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.528 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[33f30318-4594-47cc-acfc-8e593f5cbddd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427261, 'reachable_time': 36031, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214524, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.529 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:03:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:03:54.530 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6d78c5-cb56-44a5-a81c-bddb4d8b7ad8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:03:54 compute-0 systemd[1]: run-netns-ovnmeta\x2ded11b71b\x2d745b\x2d4f0c\x2d9f09\x2d37d53d166bcb.mount: Deactivated successfully.
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.674 187247 DEBUG nova.virt.libvirt.guest [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '1c6c7975-72fd-442a-b75f-0baede84a60b' (instance-0000000c) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.675 187247 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migration operation has completed
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.676 187247 INFO nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] _post_live_migration() is started..
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.693 187247 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:54 compute-0 nova_compute[187243]: 2025-12-03 00:03:54.693 187247 WARNING neutronclient.v2_0.client [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.809 187247 DEBUG nova.network.neutron [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 7b8033d7-6209-4ba1-8605-72623902a9a9 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.810 187247 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.810 187247 DEBUG nova.virt.libvirt.vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-233597543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-233597543',id=12,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:03:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-3he48fro',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:03:33Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=1c6c7975-72fd-442a-b75f-0baede84a60b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.811 187247 DEBUG nova.network.os_vif_util [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "7b8033d7-6209-4ba1-8605-72623902a9a9", "address": "fa:16:3e:68:79:f2", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b8033d7-62", "ovs_interfaceid": "7b8033d7-6209-4ba1-8605-72623902a9a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.811 187247 DEBUG nova.network.os_vif_util [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.811 187247 DEBUG os_vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.813 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.813 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b8033d7-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.814 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.816 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.817 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.817 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ba6e8c6e-bd30-4228-8b86-a14985303c90) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.818 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.819 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.821 187247 INFO os_vif [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:79:f2,bridge_name='br-int',has_traffic_filtering=True,id=7b8033d7-6209-4ba1-8605-72623902a9a9,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b8033d7-62')
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.821 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.821 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.822 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.822 187247 DEBUG nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.822 187247 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Deleting instance files /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b_del
Dec 03 00:03:55 compute-0 nova_compute[187243]: 2025-12-03 00:03:55.823 187247 INFO nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Deletion of /var/lib/nova/instances/1c6c7975-72fd-442a-b75f-0baede84a60b_del complete
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.426 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.426 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.427 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.427 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.427 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.427 187247 WARNING nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received unexpected event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with vm_state active and task_state migrating.
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.427 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.428 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.428 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.428 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.428 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.428 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.428 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.429 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.429 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.429 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.429 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.429 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-unplugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.430 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.430 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.430 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.430 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.430 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.430 187247 WARNING nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received unexpected event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with vm_state active and task_state migrating.
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.431 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.431 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.431 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.431 187247 DEBUG oslo_concurrency.lockutils [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.431 187247 DEBUG nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] No waiting events found dispatching network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.431 187247 WARNING nova.compute.manager [req-934cde83-2fa3-4586-86a2-b14d8e7db8e7 req-408a9a0b-b248-44f9-a691-9d0bbc1a2ff9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Received unexpected event network-vif-plugged-7b8033d7-6209-4ba1-8605-72623902a9a9 for instance with vm_state active and task_state migrating.
Dec 03 00:03:56 compute-0 nova_compute[187243]: 2025-12-03 00:03:56.955 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:03:59 compute-0 podman[214525]: 2025-12-03 00:03:59.111664508 +0000 UTC m=+0.065348691 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Dec 03 00:03:59 compute-0 podman[197600]: time="2025-12-03T00:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:03:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:03:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:04:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:00.690 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:00.691 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:00.691 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:00 compute-0 nova_compute[187243]: 2025-12-03 00:04:00.818 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: ERROR   00:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: ERROR   00:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: ERROR   00:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: ERROR   00:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: ERROR   00:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:04:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:04:01 compute-0 nova_compute[187243]: 2025-12-03 00:04:01.996 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:04 compute-0 nova_compute[187243]: 2025-12-03 00:04:04.864 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:04 compute-0 nova_compute[187243]: 2025-12-03 00:04:04.864 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:04 compute-0 nova_compute[187243]: 2025-12-03 00:04:04.865 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "1c6c7975-72fd-442a-b75f-0baede84a60b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.380 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.380 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.380 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.381 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.523 187247 WARNING nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.524 187247 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.559 187247 DEBUG oslo_concurrency.processutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.559 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.16493606567383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.560 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.560 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:05 compute-0 nova_compute[187243]: 2025-12-03 00:04:05.819 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:06 compute-0 nova_compute[187243]: 2025-12-03 00:04:06.581 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 1c6c7975-72fd-442a-b75f-0baede84a60b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:06.999 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:07.088 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:07.128 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration dcc97c02-db8d-4b14-b48e-41025617bbf0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:07.129 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:07.129 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:04:05 up  1:12,  0 user,  load average: 0.07, 0.19, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:07.172 187247 DEBUG nova.compute.provider_tree [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:04:07 compute-0 nova_compute[187243]: 2025-12-03 00:04:07.681 187247 DEBUG nova.scheduler.client.report [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:04:08 compute-0 nova_compute[187243]: 2025-12-03 00:04:08.190 187247 DEBUG nova.compute.resource_tracker [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:04:08 compute-0 nova_compute[187243]: 2025-12-03 00:04:08.190 187247 DEBUG oslo_concurrency.lockutils [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.630s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:08 compute-0 nova_compute[187243]: 2025-12-03 00:04:08.206 187247 INFO nova.compute.manager [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:04:09 compute-0 podman[214548]: 2025-12-03 00:04:09.102094941 +0000 UTC m=+0.059067388 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:04:09 compute-0 nova_compute[187243]: 2025-12-03 00:04:09.310 187247 INFO nova.scheduler.client.report [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration dcc97c02-db8d-4b14-b48e-41025617bbf0
Dec 03 00:04:09 compute-0 nova_compute[187243]: 2025-12-03 00:04:09.310 187247 DEBUG nova.virt.libvirt.driver [None req-5bdedca3-064b-41cd-a220-188cecb15194 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 1c6c7975-72fd-442a-b75f-0baede84a60b] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:04:10 compute-0 nova_compute[187243]: 2025-12-03 00:04:10.820 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:11 compute-0 podman[214572]: 2025-12-03 00:04:11.089696388 +0000 UTC m=+0.051416032 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 03 00:04:11 compute-0 podman[214573]: 2025-12-03 00:04:11.142668667 +0000 UTC m=+0.097660377 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:04:12 compute-0 nova_compute[187243]: 2025-12-03 00:04:12.003 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-0 sshd-session[214618]: Received disconnect from 102.210.148.92 port 56160:11: Bye Bye [preauth]
Dec 03 00:04:14 compute-0 sshd-session[214618]: Disconnected from authenticating user root 102.210.148.92 port 56160 [preauth]
Dec 03 00:04:14 compute-0 nova_compute[187243]: 2025-12-03 00:04:14.604 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:14.604 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:04:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:14.605 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:04:15 compute-0 nova_compute[187243]: 2025-12-03 00:04:15.821 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:17 compute-0 nova_compute[187243]: 2025-12-03 00:04:17.054 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:20 compute-0 nova_compute[187243]: 2025-12-03 00:04:20.838 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:20 compute-0 nova_compute[187243]: 2025-12-03 00:04:20.917 187247 DEBUG nova.compute.manager [None req-8e2994e1-9d1f-4d08-a31d-c20e6e26de91 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Dec 03 00:04:20 compute-0 nova_compute[187243]: 2025-12-03 00:04:20.959 187247 DEBUG nova.compute.provider_tree [None req-8e2994e1-9d1f-4d08-a31d-c20e6e26de91 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 18 to 21 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:04:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:21.606 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:22 compute-0 nova_compute[187243]: 2025-12-03 00:04:22.096 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:25 compute-0 podman[214624]: 2025-12-03 00:04:25.105620448 +0000 UTC m=+0.060254917 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:04:25 compute-0 sshd-session[214622]: Received disconnect from 49.247.36.49 port 59016:11: Bye Bye [preauth]
Dec 03 00:04:25 compute-0 sshd-session[214622]: Disconnected from authenticating user root 49.247.36.49 port 59016 [preauth]
Dec 03 00:04:25 compute-0 nova_compute[187243]: 2025-12-03 00:04:25.878 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:26 compute-0 nova_compute[187243]: 2025-12-03 00:04:26.217 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:26 compute-0 nova_compute[187243]: 2025-12-03 00:04:26.218 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:26 compute-0 nova_compute[187243]: 2025-12-03 00:04:26.722 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:04:27 compute-0 nova_compute[187243]: 2025-12-03 00:04:27.136 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:27 compute-0 nova_compute[187243]: 2025-12-03 00:04:27.282 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:27 compute-0 nova_compute[187243]: 2025-12-03 00:04:27.283 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:27 compute-0 nova_compute[187243]: 2025-12-03 00:04:27.291 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:04:27 compute-0 nova_compute[187243]: 2025-12-03 00:04:27.291 187247 INFO nova.compute.claims [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:04:28 compute-0 nova_compute[187243]: 2025-12-03 00:04:28.356 187247 DEBUG nova.compute.provider_tree [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:04:28 compute-0 sshd-session[214649]: Invalid user cc from 20.123.120.169 port 51616
Dec 03 00:04:28 compute-0 sshd-session[214649]: Received disconnect from 20.123.120.169 port 51616:11: Bye Bye [preauth]
Dec 03 00:04:28 compute-0 sshd-session[214649]: Disconnected from invalid user cc 20.123.120.169 port 51616 [preauth]
Dec 03 00:04:28 compute-0 nova_compute[187243]: 2025-12-03 00:04:28.865 187247 DEBUG nova.scheduler.client.report [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:04:29 compute-0 nova_compute[187243]: 2025-12-03 00:04:29.377 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:29 compute-0 nova_compute[187243]: 2025-12-03 00:04:29.378 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:04:29 compute-0 sshd-session[214647]: Received disconnect from 45.78.222.160 port 56490:11: Bye Bye [preauth]
Dec 03 00:04:29 compute-0 sshd-session[214647]: Disconnected from authenticating user root 45.78.222.160 port 56490 [preauth]
Dec 03 00:04:29 compute-0 podman[197600]: time="2025-12-03T00:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:04:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:04:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec 03 00:04:29 compute-0 nova_compute[187243]: 2025-12-03 00:04:29.889 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:04:29 compute-0 nova_compute[187243]: 2025-12-03 00:04:29.890 187247 DEBUG nova.network.neutron [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:04:29 compute-0 nova_compute[187243]: 2025-12-03 00:04:29.890 187247 WARNING neutronclient.v2_0.client [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:29 compute-0 nova_compute[187243]: 2025-12-03 00:04:29.891 187247 WARNING neutronclient.v2_0.client [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:30 compute-0 podman[214651]: 2025-12-03 00:04:30.106984083 +0000 UTC m=+0.061162400 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:04:30 compute-0 nova_compute[187243]: 2025-12-03 00:04:30.403 187247 INFO nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:04:30 compute-0 nova_compute[187243]: 2025-12-03 00:04:30.881 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:30 compute-0 nova_compute[187243]: 2025-12-03 00:04:30.910 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: ERROR   00:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: ERROR   00:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: ERROR   00:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: ERROR   00:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: ERROR   00:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:04:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.453 187247 DEBUG nova.network.neutron [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Successfully created port: 3fc60c87-0094-403e-9fb0-564004da22b1 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.946 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.947 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.947 187247 INFO nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Creating image(s)
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.948 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.948 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.949 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.949 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.952 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:31 compute-0 nova_compute[187243]: 2025-12-03 00:04:31.953 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.024 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.025 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.025 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.026 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.029 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.029 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.076 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.077 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.105 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.106 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.107 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.138 187247 DEBUG nova.network.neutron [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Successfully updated port: 3fc60c87-0094-403e-9fb0-564004da22b1 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.180 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.190 187247 DEBUG nova.compute.manager [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-changed-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.190 187247 DEBUG nova.compute.manager [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Refreshing instance network info cache due to event network-changed-3fc60c87-0094-403e-9fb0-564004da22b1. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.191 187247 DEBUG oslo_concurrency.lockutils [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.191 187247 DEBUG oslo_concurrency.lockutils [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.191 187247 DEBUG nova.network.neutron [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Refreshing network info cache for port 3fc60c87-0094-403e-9fb0-564004da22b1 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.222 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.115s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.223 187247 DEBUG nova.virt.disk.api [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Checking if we can resize image /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.224 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.280 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.281 187247 DEBUG nova.virt.disk.api [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Cannot resize image /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.281 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.281 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Ensure instance console log exists: /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.282 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.282 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.282 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.679 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.698 187247 WARNING neutronclient.v2_0.client [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.780 187247 DEBUG nova.network.neutron [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:04:32 compute-0 nova_compute[187243]: 2025-12-03 00:04:32.958 187247 DEBUG nova.network.neutron [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:33 compute-0 nova_compute[187243]: 2025-12-03 00:04:33.466 187247 DEBUG oslo_concurrency.lockutils [req-8d314886-380d-4aa0-915e-50c0fb5b7e22 req-7bb86da0-c52e-4e06-8360-7594122657b4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:04:33 compute-0 nova_compute[187243]: 2025-12-03 00:04:33.467 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquired lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:04:33 compute-0 nova_compute[187243]: 2025-12-03 00:04:33.467 187247 DEBUG nova.network.neutron [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:04:34 compute-0 nova_compute[187243]: 2025-12-03 00:04:34.292 187247 DEBUG nova.network.neutron [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:04:34 compute-0 nova_compute[187243]: 2025-12-03 00:04:34.457 187247 WARNING neutronclient.v2_0.client [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:34 compute-0 nova_compute[187243]: 2025-12-03 00:04:34.602 187247 DEBUG nova.network.neutron [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.108 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Releasing lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.109 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance network_info: |[{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.111 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Start _get_guest_xml network_info=[{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.114 187247 WARNING nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.115 187247 DEBUG nova.virt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723', uuid='5d86e858-6a62-411e-a8dc-dffcfa247bfc'), owner=OwnerMeta(userid='ab182b4a69794d1fa103fbd3d503df99', username='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin', projectid='85e2f91a92cf4b5a9d626e8418f17322', projectname='tempest-TestExecuteHostMaintenanceStrategy-1767783627'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": 
"3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720275.1159227) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.119 187247 DEBUG nova.virt.libvirt.host [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.120 187247 DEBUG nova.virt.libvirt.host [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.122 187247 DEBUG nova.virt.libvirt.host [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.122 187247 DEBUG nova.virt.libvirt.host [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.123 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.123 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.123 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.124 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.124 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.124 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.124 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.124 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.124 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.125 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.125 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.125 187247 DEBUG nova.virt.hardware [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.128 187247 DEBUG nova.virt.libvirt.vif [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:04:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1547033723',id=14,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-rbawllbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-T
estExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:04:30Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.128 187247 DEBUG nova.network.os_vif_util [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.128 187247 DEBUG nova.network.os_vif_util [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.129 187247 DEBUG nova.objects.instance [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d86e858-6a62-411e-a8dc-dffcfa247bfc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.634 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <uuid>5d86e858-6a62-411e-a8dc-dffcfa247bfc</uuid>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <name>instance-0000000e</name>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1547033723</nova:name>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:04:35</nova:creationTime>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:04:35 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:04:35 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         <nova:port uuid="3fc60c87-0094-403e-9fb0-564004da22b1">
Dec 03 00:04:35 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <system>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <entry name="serial">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <entry name="uuid">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </system>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <os>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </os>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <features>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </features>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:6d:ee:e6"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <target dev="tap3fc60c87-00"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <video>
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </video>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:04:35 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:04:35 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:04:35 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:04:35 compute-0 nova_compute[187243]: </domain>
Dec 03 00:04:35 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.636 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Preparing to wait for external event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.636 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.637 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.637 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.639 187247 DEBUG nova.virt.libvirt.vif [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:04:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1547033723',id=14,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-rbawllbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:04:30Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.639 187247 DEBUG nova.network.os_vif_util [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converting VIF {"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.640 187247 DEBUG nova.network.os_vif_util [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.641 187247 DEBUG os_vif [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.642 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.642 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.643 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.644 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.644 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b367e7bd-c8dd-5599-9ab1-45aab2851543', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.646 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.649 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.651 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.652 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3fc60c87-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.652 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3fc60c87-00, col_values=(('qos', UUID('38a0319a-9d81-467a-baec-72f5b209e699')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.653 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3fc60c87-00, col_values=(('external_ids', {'iface-id': '3fc60c87-0094-403e-9fb0-564004da22b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:ee:e6', 'vm-uuid': '5d86e858-6a62-411e-a8dc-dffcfa247bfc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.654 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-0 NetworkManager[55671]: <info>  [1764720275.6552] manager: (tap3fc60c87-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.657 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.659 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:35 compute-0 nova_compute[187243]: 2025-12-03 00:04:35.660 187247 INFO os_vif [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00')
Dec 03 00:04:36 compute-0 nova_compute[187243]: 2025-12-03 00:04:36.713 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:36 compute-0 nova_compute[187243]: 2025-12-03 00:04:36.714 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:36 compute-0 nova_compute[187243]: 2025-12-03 00:04:36.714 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:36 compute-0 nova_compute[187243]: 2025-12-03 00:04:36.715 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:04:37 compute-0 nova_compute[187243]: 2025-12-03 00:04:37.181 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:37 compute-0 nova_compute[187243]: 2025-12-03 00:04:37.199 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:04:37 compute-0 nova_compute[187243]: 2025-12-03 00:04:37.199 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:04:37 compute-0 nova_compute[187243]: 2025-12-03 00:04:37.200 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] No VIF found with MAC fa:16:3e:6d:ee:e6, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:04:37 compute-0 nova_compute[187243]: 2025-12-03 00:04:37.201 187247 INFO nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Using config drive
Dec 03 00:04:37 compute-0 nova_compute[187243]: 2025-12-03 00:04:37.709 187247 WARNING neutronclient.v2_0.client [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.004 187247 INFO nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Creating config drive at /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.008 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpyikisqa7 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.142 187247 DEBUG oslo_concurrency.processutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpyikisqa7" returned: 0 in 0.134s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:38 compute-0 kernel: tap3fc60c87-00: entered promiscuous mode
Dec 03 00:04:38 compute-0 NetworkManager[55671]: <info>  [1764720278.2058] manager: (tap3fc60c87-00): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Dec 03 00:04:38 compute-0 ovn_controller[95488]: 2025-12-03T00:04:38Z|00111|binding|INFO|Claiming lport 3fc60c87-0094-403e-9fb0-564004da22b1 for this chassis.
Dec 03 00:04:38 compute-0 ovn_controller[95488]: 2025-12-03T00:04:38Z|00112|binding|INFO|3fc60c87-0094-403e-9fb0-564004da22b1: Claiming fa:16:3e:6d:ee:e6 10.100.0.11
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.207 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.217 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:ee:e6 10.100.0.11'], port_security=['fa:16:3e:6d:ee:e6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5d86e858-6a62-411e-a8dc-dffcfa247bfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=3fc60c87-0094-403e-9fb0-564004da22b1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.218 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 3fc60c87-0094-403e-9fb0-564004da22b1 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb bound to our chassis
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.221 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:38 compute-0 ovn_controller[95488]: 2025-12-03T00:04:38Z|00113|binding|INFO|Setting lport 3fc60c87-0094-403e-9fb0-564004da22b1 ovn-installed in OVS
Dec 03 00:04:38 compute-0 ovn_controller[95488]: 2025-12-03T00:04:38Z|00114|binding|INFO|Setting lport 3fc60c87-0094-403e-9fb0-564004da22b1 up in Southbound
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.224 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.234 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[64c61386-c650-49ee-b262-9f19708cc1f8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.235 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped11b71b-71 in ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.237 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped11b71b-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.237 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[693c9f52-fbe7-458b-b424-237cc83b27ba]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.238 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ada9014c-114c-4e62-8224-455277010385]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 systemd-udevd[214705]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:04:38 compute-0 systemd-machined[153518]: New machine qemu-9-instance-0000000e.
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.258 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e6dba9-c7f3-4b50-9dc9-16253ba818d8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 NetworkManager[55671]: <info>  [1764720278.2659] device (tap3fc60c87-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:04:38 compute-0 NetworkManager[55671]: <info>  [1764720278.2675] device (tap3fc60c87-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.266 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[84c4596d-62be-41a2-b4b6-4c9772050cae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000e.
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.298 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a70d7e-b0d6-451c-9207-933fb8408d73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.303 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[407f6db0-593b-4dfe-ab20-2d21e8a550ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 NetworkManager[55671]: <info>  [1764720278.3059] manager: (taped11b71b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.331 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ff8755-a65d-4500-b892-52192c5bff08]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.334 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ada13f9f-3abb-4e02-93be-3516f4366866]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 NetworkManager[55671]: <info>  [1764720278.3604] device (taped11b71b-70): carrier: link connected
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.368 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[2812f315-d4a3-4f06-8458-03f35c0566a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.388 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb0b9eb-3dff-4673-a417-48c89e09a267]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436900, 'reachable_time': 29337, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214739, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.397 187247 DEBUG nova.compute.manager [req-97f1e2b1-0ca6-49b0-9498-ca2b2abf8f24 req-8bb3ecdf-3801-40a6-b254-c513aa6e8129 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.398 187247 DEBUG oslo_concurrency.lockutils [req-97f1e2b1-0ca6-49b0-9498-ca2b2abf8f24 req-8bb3ecdf-3801-40a6-b254-c513aa6e8129 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.398 187247 DEBUG oslo_concurrency.lockutils [req-97f1e2b1-0ca6-49b0-9498-ca2b2abf8f24 req-8bb3ecdf-3801-40a6-b254-c513aa6e8129 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.398 187247 DEBUG oslo_concurrency.lockutils [req-97f1e2b1-0ca6-49b0-9498-ca2b2abf8f24 req-8bb3ecdf-3801-40a6-b254-c513aa6e8129 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.398 187247 DEBUG nova.compute.manager [req-97f1e2b1-0ca6-49b0-9498-ca2b2abf8f24 req-8bb3ecdf-3801-40a6-b254-c513aa6e8129 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Processing event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.413 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e6334371-a1ef-4237-8ec3-8309827a2c0d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:bbc6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436900, 'tstamp': 436900}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214740, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.437 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce3bfea-7e19-4345-9191-7b9a0519dd28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped11b71b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:bb:c6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436900, 'reachable_time': 29337, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214741, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.479 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9f0fe3-306b-4aee-af03-25f1772f1260]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.573 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4578ae63-ef20-47cb-ae0d-e3f4f6bce550]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.575 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.575 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.576 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped11b71b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.579 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:38 compute-0 kernel: taped11b71b-70: entered promiscuous mode
Dec 03 00:04:38 compute-0 NetworkManager[55671]: <info>  [1764720278.5804] manager: (taped11b71b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.584 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.585 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped11b71b-70, col_values=(('external_ids', {'iface-id': 'add6ea4f-8836-4bed-8f1e-39e943ccf4b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.587 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:38 compute-0 ovn_controller[95488]: 2025-12-03T00:04:38Z|00115|binding|INFO|Releasing lport add6ea4f-8836-4bed-8f1e-39e943ccf4b5 from this chassis (sb_readonly=0)
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:38 compute-0 nova_compute[187243]: 2025-12-03 00:04:38.610 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.612 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[64d222f8-fa5c-4671-b3c3-1b9b8149d3d9]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.613 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.613 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.613 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ed11b71b-745b-4f0c-9f09-37d53d166bcb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.613 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.614 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[76ff5627-5fb0-4dd7-b9f1-0426c9a42995]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.614 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.615 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6b83930f-e127-4e3b-98e1-81c613a46883]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.615 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID ed11b71b-745b-4f0c-9f09-37d53d166bcb
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:04:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:04:38.616 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'env', 'PROCESS_TAG=haproxy-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed11b71b-745b-4f0c-9f09-37d53d166bcb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:04:39 compute-0 podman[214773]: 2025-12-03 00:04:39.019217537 +0000 UTC m=+0.058698409 container create cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 03 00:04:39 compute-0 systemd[1]: Started libpod-conmon-cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b.scope.
Dec 03 00:04:39 compute-0 podman[214773]: 2025-12-03 00:04:38.990745284 +0000 UTC m=+0.030226176 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:04:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:04:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6eaf3ec3651262a52b04424801621059840c5fc6715bd8e9c20ce7c1f1e162be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:04:39 compute-0 podman[214773]: 2025-12-03 00:04:39.098058716 +0000 UTC m=+0.137539608 container init cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:04:39 compute-0 podman[214773]: 2025-12-03 00:04:39.102691908 +0000 UTC m=+0.142172780 container start cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:04:39 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [NOTICE]   (214792) : New worker (214794) forked
Dec 03 00:04:39 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [NOTICE]   (214792) : Loading success.
Dec 03 00:04:39 compute-0 podman[214805]: 2025-12-03 00:04:39.48000738 +0000 UTC m=+0.058209647 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:04:39 compute-0 nova_compute[187243]: 2025-12-03 00:04:39.505 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:04:39 compute-0 nova_compute[187243]: 2025-12-03 00:04:39.508 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:04:39 compute-0 nova_compute[187243]: 2025-12-03 00:04:39.511 187247 INFO nova.virt.libvirt.driver [-] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance spawned successfully.
Dec 03 00:04:39 compute-0 nova_compute[187243]: 2025-12-03 00:04:39.512 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.028 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.029 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.030 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.031 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.031 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.032 187247 DEBUG nova.virt.libvirt.driver [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.487 187247 DEBUG nova.compute.manager [req-8410b5d8-bfc0-4286-abb2-6360abefde7b req-186c7222-b819-4ac6-b7c2-df5a7ba29c21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.487 187247 DEBUG oslo_concurrency.lockutils [req-8410b5d8-bfc0-4286-abb2-6360abefde7b req-186c7222-b819-4ac6-b7c2-df5a7ba29c21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.488 187247 DEBUG oslo_concurrency.lockutils [req-8410b5d8-bfc0-4286-abb2-6360abefde7b req-186c7222-b819-4ac6-b7c2-df5a7ba29c21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.488 187247 DEBUG oslo_concurrency.lockutils [req-8410b5d8-bfc0-4286-abb2-6360abefde7b req-186c7222-b819-4ac6-b7c2-df5a7ba29c21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.488 187247 DEBUG nova.compute.manager [req-8410b5d8-bfc0-4286-abb2-6360abefde7b req-186c7222-b819-4ac6-b7c2-df5a7ba29c21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.488 187247 WARNING nova.compute.manager [req-8410b5d8-bfc0-4286-abb2-6360abefde7b req-186c7222-b819-4ac6-b7c2-df5a7ba29c21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received unexpected event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with vm_state building and task_state spawning.
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.544 187247 INFO nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Took 8.60 seconds to spawn the instance on the hypervisor.
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.544 187247 DEBUG nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:04:40 compute-0 nova_compute[187243]: 2025-12-03 00:04:40.699 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:41 compute-0 nova_compute[187243]: 2025-12-03 00:04:41.080 187247 INFO nova.compute.manager [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Took 13.84 seconds to build instance.
Dec 03 00:04:41 compute-0 nova_compute[187243]: 2025-12-03 00:04:41.586 187247 DEBUG oslo_concurrency.lockutils [None req-59c8efe4-90ab-4f2c-9ca2-6cc1d9113e80 ab182b4a69794d1fa103fbd3d503df99 85e2f91a92cf4b5a9d626e8418f17322 - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.369s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:41 compute-0 nova_compute[187243]: 2025-12-03 00:04:41.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:41 compute-0 nova_compute[187243]: 2025-12-03 00:04:41.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:41 compute-0 nova_compute[187243]: 2025-12-03 00:04:41.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:42 compute-0 podman[214834]: 2025-12-03 00:04:42.099474834 +0000 UTC m=+0.051101815 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 03 00:04:42 compute-0 nova_compute[187243]: 2025-12-03 00:04:42.115 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:42 compute-0 nova_compute[187243]: 2025-12-03 00:04:42.116 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:42 compute-0 nova_compute[187243]: 2025-12-03 00:04:42.116 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:42 compute-0 nova_compute[187243]: 2025-12-03 00:04:42.116 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:04:42 compute-0 podman[214835]: 2025-12-03 00:04:42.149029609 +0000 UTC m=+0.093807253 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:04:42 compute-0 nova_compute[187243]: 2025-12-03 00:04:42.182 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.156 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.206 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.207 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.257 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.374 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.375 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.392 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.393 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5648MB free_disk=73.16399002075195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.393 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:04:43 compute-0 nova_compute[187243]: 2025-12-03 00:04:43.394 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:04:44 compute-0 nova_compute[187243]: 2025-12-03 00:04:44.474 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:04:44 compute-0 nova_compute[187243]: 2025-12-03 00:04:44.474 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:04:44 compute-0 nova_compute[187243]: 2025-12-03 00:04:44.475 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:04:43 up  1:12,  0 user,  load average: 0.12, 0.18, 0.33\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_85e2f91a92cf4b5a9d626e8418f17322': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:04:44 compute-0 nova_compute[187243]: 2025-12-03 00:04:44.523 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:04:45 compute-0 nova_compute[187243]: 2025-12-03 00:04:45.030 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:04:45 compute-0 nova_compute[187243]: 2025-12-03 00:04:45.539 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:04:45 compute-0 nova_compute[187243]: 2025-12-03 00:04:45.540 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:04:45 compute-0 nova_compute[187243]: 2025-12-03 00:04:45.702 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:47 compute-0 nova_compute[187243]: 2025-12-03 00:04:47.186 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:50 compute-0 nova_compute[187243]: 2025-12-03 00:04:50.706 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:52 compute-0 nova_compute[187243]: 2025-12-03 00:04:52.231 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:52 compute-0 ovn_controller[95488]: 2025-12-03T00:04:52Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:ee:e6 10.100.0.11
Dec 03 00:04:52 compute-0 ovn_controller[95488]: 2025-12-03T00:04:52Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:ee:e6 10.100.0.11
Dec 03 00:04:53 compute-0 nova_compute[187243]: 2025-12-03 00:04:53.536 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:04:55 compute-0 nova_compute[187243]: 2025-12-03 00:04:55.752 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:56 compute-0 podman[214900]: 2025-12-03 00:04:56.10336695 +0000 UTC m=+0.058094684 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:04:57 compute-0 nova_compute[187243]: 2025-12-03 00:04:57.234 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:04:59 compute-0 podman[197600]: time="2025-12-03T00:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:04:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:04:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3067 "" "Go-http-client/1.1"
Dec 03 00:05:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:00.692 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:00.692 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:00.693 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:00 compute-0 nova_compute[187243]: 2025-12-03 00:05:00.755 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:01 compute-0 podman[214922]: 2025-12-03 00:05:01.101944627 +0000 UTC m=+0.054054786 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:05:01 compute-0 openstack_network_exporter[199746]: ERROR   00:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:01 compute-0 openstack_network_exporter[199746]: ERROR   00:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:01 compute-0 openstack_network_exporter[199746]: ERROR   00:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:05:01 compute-0 openstack_network_exporter[199746]: ERROR   00:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:05:01 compute-0 openstack_network_exporter[199746]: ERROR   00:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:05:02 compute-0 nova_compute[187243]: 2025-12-03 00:05:02.242 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:04 compute-0 sshd-session[214943]: Received disconnect from 61.220.235.10 port 34956:11: Bye Bye [preauth]
Dec 03 00:05:04 compute-0 sshd-session[214943]: Disconnected from authenticating user root 61.220.235.10 port 34956 [preauth]
Dec 03 00:05:05 compute-0 nova_compute[187243]: 2025-12-03 00:05:05.759 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:07 compute-0 nova_compute[187243]: 2025-12-03 00:05:07.274 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:08 compute-0 nova_compute[187243]: 2025-12-03 00:05:08.283 187247 DEBUG nova.compute.manager [None req-93b062d6-32f5-4550-8069-d2340b2848d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Dec 03 00:05:08 compute-0 nova_compute[187243]: 2025-12-03 00:05:08.339 187247 DEBUG nova.compute.provider_tree [None req-93b062d6-32f5-4550-8069-d2340b2848d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 21 to 23 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:05:10 compute-0 podman[214945]: 2025-12-03 00:05:10.134961093 +0000 UTC m=+0.076867812 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:05:10 compute-0 nova_compute[187243]: 2025-12-03 00:05:10.763 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:11 compute-0 sshd-session[214971]: Invalid user khan from 23.95.37.90 port 59396
Dec 03 00:05:11 compute-0 sshd-session[214971]: Received disconnect from 23.95.37.90 port 59396:11: Bye Bye [preauth]
Dec 03 00:05:11 compute-0 sshd-session[214971]: Disconnected from invalid user khan 23.95.37.90 port 59396 [preauth]
Dec 03 00:05:12 compute-0 nova_compute[187243]: 2025-12-03 00:05:12.306 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:13 compute-0 podman[214973]: 2025-12-03 00:05:13.112650653 +0000 UTC m=+0.073985732 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:05:13 compute-0 podman[214974]: 2025-12-03 00:05:13.113111794 +0000 UTC m=+0.070791374 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:05:15 compute-0 nova_compute[187243]: 2025-12-03 00:05:15.766 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:15 compute-0 nova_compute[187243]: 2025-12-03 00:05:15.807 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Check if temp file /var/lib/nova/instances/tmp6w8eatjn exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:05:15 compute-0 nova_compute[187243]: 2025-12-03 00:05:15.811 187247 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d86e858-6a62-411e-a8dc-dffcfa247bfc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:05:17 compute-0 nova_compute[187243]: 2025-12-03 00:05:17.342 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.205 187247 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.256 187247 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.257 187247 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.317 187247 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.318 187247 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Preparing to wait for external event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.318 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.318 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.319 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:20 compute-0 nova_compute[187243]: 2025-12-03 00:05:20.770 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:22 compute-0 nova_compute[187243]: 2025-12-03 00:05:22.344 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:25 compute-0 nova_compute[187243]: 2025-12-03 00:05:25.774 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:26 compute-0 nova_compute[187243]: 2025-12-03 00:05:26.835 187247 DEBUG nova.compute.manager [req-e6b28893-4780-4483-b645-6b04d9febf0e req-442d7243-177c-4505-b119-0d1411b06fe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:26 compute-0 nova_compute[187243]: 2025-12-03 00:05:26.836 187247 DEBUG oslo_concurrency.lockutils [req-e6b28893-4780-4483-b645-6b04d9febf0e req-442d7243-177c-4505-b119-0d1411b06fe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:26 compute-0 nova_compute[187243]: 2025-12-03 00:05:26.836 187247 DEBUG oslo_concurrency.lockutils [req-e6b28893-4780-4483-b645-6b04d9febf0e req-442d7243-177c-4505-b119-0d1411b06fe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:26 compute-0 nova_compute[187243]: 2025-12-03 00:05:26.837 187247 DEBUG oslo_concurrency.lockutils [req-e6b28893-4780-4483-b645-6b04d9febf0e req-442d7243-177c-4505-b119-0d1411b06fe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:26 compute-0 nova_compute[187243]: 2025-12-03 00:05:26.837 187247 DEBUG nova.compute.manager [req-e6b28893-4780-4483-b645-6b04d9febf0e req-442d7243-177c-4505-b119-0d1411b06fe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No event matching network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 in dict_keys([('network-vif-plugged', '3fc60c87-0094-403e-9fb0-564004da22b1')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:05:26 compute-0 nova_compute[187243]: 2025-12-03 00:05:26.838 187247 DEBUG nova.compute.manager [req-e6b28893-4780-4483-b645-6b04d9febf0e req-442d7243-177c-4505-b119-0d1411b06fe9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:26 compute-0 ovn_controller[95488]: 2025-12-03T00:05:26Z|00116|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:05:27 compute-0 podman[215026]: 2025-12-03 00:05:27.096896964 +0000 UTC m=+0.057356527 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:05:27 compute-0 nova_compute[187243]: 2025-12-03 00:05:27.377 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:28 compute-0 sshd-session[215024]: Invalid user deploy from 102.210.148.92 port 34684
Dec 03 00:05:28 compute-0 sshd-session[215024]: Received disconnect from 102.210.148.92 port 34684:11: Bye Bye [preauth]
Dec 03 00:05:28 compute-0 sshd-session[215024]: Disconnected from invalid user deploy 102.210.148.92 port 34684 [preauth]
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.843 187247 INFO nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Took 8.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.918 187247 DEBUG nova.compute.manager [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.918 187247 DEBUG oslo_concurrency.lockutils [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.919 187247 DEBUG oslo_concurrency.lockutils [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.919 187247 DEBUG oslo_concurrency.lockutils [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.919 187247 DEBUG nova.compute.manager [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Processing event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.919 187247 DEBUG nova.compute.manager [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-changed-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.919 187247 DEBUG nova.compute.manager [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Refreshing instance network info cache due to event network-changed-3fc60c87-0094-403e-9fb0-564004da22b1. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.919 187247 DEBUG oslo_concurrency.lockutils [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.920 187247 DEBUG oslo_concurrency.lockutils [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.920 187247 DEBUG nova.network.neutron [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Refreshing network info cache for port 3fc60c87-0094-403e-9fb0-564004da22b1 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:05:28 compute-0 nova_compute[187243]: 2025-12-03 00:05:28.921 187247 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:05:29 compute-0 nova_compute[187243]: 2025-12-03 00:05:29.427 187247 WARNING neutronclient.v2_0.client [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:29 compute-0 nova_compute[187243]: 2025-12-03 00:05:29.433 187247 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6w8eatjn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d86e858-6a62-411e-a8dc-dffcfa247bfc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ad96c46c-250d-4dee-aab8-996ce344a8d0),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:05:29 compute-0 podman[197600]: time="2025-12-03T00:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:05:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:05:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Dec 03 00:05:29 compute-0 nova_compute[187243]: 2025-12-03 00:05:29.854 187247 WARNING neutronclient.v2_0.client [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:30 compute-0 nova_compute[187243]: 2025-12-03 00:05:30.557 187247 DEBUG nova.network.neutron [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updated VIF entry in instance network info cache for port 3fc60c87-0094-403e-9fb0-564004da22b1. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:05:30 compute-0 nova_compute[187243]: 2025-12-03 00:05:30.557 187247 DEBUG nova.network.neutron [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating instance_info_cache with network_info: [{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:05:30 compute-0 nova_compute[187243]: 2025-12-03 00:05:30.778 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: ERROR   00:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: ERROR   00:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: ERROR   00:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.416 187247 DEBUG nova.objects.instance [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 5d86e858-6a62-411e-a8dc-dffcfa247bfc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.417 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: ERROR   00:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: ERROR   00:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:05:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.419 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.419 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.694 187247 DEBUG oslo_concurrency.lockutils [req-09aaad93-aab5-4f41-af49-5f9341138b8a req-c60052eb-4105-43a6-98fe-28345a2f7cde 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5d86e858-6a62-411e-a8dc-dffcfa247bfc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.921 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.922 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.932 187247 DEBUG nova.virt.libvirt.vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:04:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1547033723',id=14,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:04:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-rbawllbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:04:40Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.933 187247 DEBUG nova.network.os_vif_util [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.934 187247 DEBUG nova.network.os_vif_util [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.935 187247 DEBUG nova.virt.libvirt.migration [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:6d:ee:e6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <target dev="tap3fc60c87-00"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]: </interface>
Dec 03 00:05:31 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.936 187247 DEBUG nova.virt.libvirt.migration [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <name>instance-0000000e</name>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <uuid>5d86e858-6a62-411e-a8dc-dffcfa247bfc</uuid>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1547033723</nova:name>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:04:35</nova:creationTime>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:port uuid="3fc60c87-0094-403e-9fb0-564004da22b1">
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <system>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="serial">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="uuid">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </system>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <os>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </os>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <features>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </features>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:6d:ee:e6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3fc60c87-00"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </target>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </console>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </input>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <video>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </video>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]: </domain>
Dec 03 00:05:31 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.938 187247 DEBUG nova.virt.libvirt.migration [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <name>instance-0000000e</name>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <uuid>5d86e858-6a62-411e-a8dc-dffcfa247bfc</uuid>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1547033723</nova:name>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:04:35</nova:creationTime>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:port uuid="3fc60c87-0094-403e-9fb0-564004da22b1">
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <system>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="serial">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="uuid">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </system>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <os>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </os>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <features>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </features>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:6d:ee:e6"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3fc60c87-00"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </target>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </console>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </input>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <video>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </video>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]: </domain>
Dec 03 00:05:31 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.938 187247 DEBUG nova.virt.libvirt.migration [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <name>instance-0000000e</name>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <uuid>5d86e858-6a62-411e-a8dc-dffcfa247bfc</uuid>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1547033723</nova:name>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:04:35</nova:creationTime>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:user uuid="ab182b4a69794d1fa103fbd3d503df99">tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin</nova:user>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:project uuid="85e2f91a92cf4b5a9d626e8418f17322">tempest-TestExecuteHostMaintenanceStrategy-1767783627</nova:project>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <nova:port uuid="3fc60c87-0094-403e-9fb0-564004da22b1">
Dec 03 00:05:31 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <system>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="serial">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="uuid">5d86e858-6a62-411e-a8dc-dffcfa247bfc</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </system>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <os>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </os>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <features>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </features>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/disk.config"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:6d:ee:e6"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target dev="tap3fc60c87-00"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:05:31 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       </target>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc/console.log" append="off"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </console>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </input>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <video>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </video>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:05:31 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:05:31 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:05:31 compute-0 nova_compute[187243]: </domain>
Dec 03 00:05:31 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:05:31 compute-0 nova_compute[187243]: 2025-12-03 00:05:31.938 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:05:32 compute-0 podman[215047]: 2025-12-03 00:05:32.110860247 +0000 UTC m=+0.063968948 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 03 00:05:32 compute-0 nova_compute[187243]: 2025-12-03 00:05:32.414 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:32 compute-0 nova_compute[187243]: 2025-12-03 00:05:32.424 187247 DEBUG nova.virt.libvirt.migration [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:05:32 compute-0 nova_compute[187243]: 2025-12-03 00:05:32.425 187247 INFO nova.virt.libvirt.migration [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.466 187247 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:05:33 compute-0 kernel: tap3fc60c87-00 (unregistering): left promiscuous mode
Dec 03 00:05:33 compute-0 NetworkManager[55671]: <info>  [1764720333.4966] device (tap3fc60c87-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:05:33 compute-0 ovn_controller[95488]: 2025-12-03T00:05:33Z|00117|binding|INFO|Releasing lport 3fc60c87-0094-403e-9fb0-564004da22b1 from this chassis (sb_readonly=0)
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.562 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:33 compute-0 ovn_controller[95488]: 2025-12-03T00:05:33Z|00118|binding|INFO|Setting lport 3fc60c87-0094-403e-9fb0-564004da22b1 down in Southbound
Dec 03 00:05:33 compute-0 ovn_controller[95488]: 2025-12-03T00:05:33Z|00119|binding|INFO|Removing iface tap3fc60c87-00 ovn-installed in OVS
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.565 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.575 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:ee:e6 10.100.0.11'], port_security=['fa:16:3e:6d:ee:e6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5d86e858-6a62-411e-a8dc-dffcfa247bfc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85e2f91a92cf4b5a9d626e8418f17322', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2256d612-5a1d-4528-93f3-139a5d1ff76a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e46e490-abb3-4025-b870-a46519cde774, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=3fc60c87-0094-403e-9fb0-564004da22b1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.577 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 3fc60c87-0094-403e-9fb0-564004da22b1 in datapath ed11b71b-745b-4f0c-9f09-37d53d166bcb unbound from our chassis
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.579 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed11b71b-745b-4f0c-9f09-37d53d166bcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.581 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[571cfba7-fd95-428b-8f7a-332dd1bcbf16]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.581 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb namespace which is not needed anymore
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.590 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:05:33 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec 03 00:05:33 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Consumed 14.952s CPU time.
Dec 03 00:05:33 compute-0 systemd-machined[153518]: Machine qemu-9-instance-0000000e terminated.
Dec 03 00:05:33 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [NOTICE]   (214792) : haproxy version is 3.0.5-8e879a5
Dec 03 00:05:33 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [NOTICE]   (214792) : path to executable is /usr/sbin/haproxy
Dec 03 00:05:33 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [WARNING]  (214792) : Exiting Master process...
Dec 03 00:05:33 compute-0 podman[215106]: 2025-12-03 00:05:33.73627921 +0000 UTC m=+0.048143553 container kill cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:05:33 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [ALERT]    (214792) : Current worker (214794) exited with code 143 (Terminated)
Dec 03 00:05:33 compute-0 neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb[214788]: [WARNING]  (214792) : All workers exited. Exiting... (0)
Dec 03 00:05:33 compute-0 systemd[1]: libpod-cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b.scope: Deactivated successfully.
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.746 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.747 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.747 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:05:33 compute-0 podman[215138]: 2025-12-03 00:05:33.787449855 +0000 UTC m=+0.029718974 container died cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true)
Dec 03 00:05:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b-userdata-shm.mount: Deactivated successfully.
Dec 03 00:05:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-6eaf3ec3651262a52b04424801621059840c5fc6715bd8e9c20ce7c1f1e162be-merged.mount: Deactivated successfully.
Dec 03 00:05:33 compute-0 podman[215138]: 2025-12-03 00:05:33.820304235 +0000 UTC m=+0.062573344 container cleanup cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:05:33 compute-0 systemd[1]: libpod-conmon-cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b.scope: Deactivated successfully.
Dec 03 00:05:33 compute-0 podman[215140]: 2025-12-03 00:05:33.839744308 +0000 UTC m=+0.069883002 container remove cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.845 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[37e0e857-aecf-4068-b6f9-73fb4846a7fe]: (4, ("Wed Dec  3 12:05:33 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b)\ncf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b\nWed Dec  3 12:05:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb (cf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b)\ncf9281efeb521f6230ad141064a39e80d1a09ed899d8c76854f5f97a2595469b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.846 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[03111922-a998-42a7-ae25-21d0d70e1dfd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.847 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed11b71b-745b-4f0c-9f09-37d53d166bcb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.847 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b2a532-1d53-41c4-b58c-de11ce4d6dec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.849 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped11b71b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.851 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.874 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:33 compute-0 kernel: taped11b71b-70: left promiscuous mode
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.882 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.883 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a08b537d-97b0-4f3f-8f11-33d1c40cd3c4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.900 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[15664109-232e-4eb6-b9e2-90f1c01e9172]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.900 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6c86b850-c21a-4962-b9d2-9f85cd8c731a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.919 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[73483c04-5737-4692-934e-ac7fa7c49d79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436893, 'reachable_time': 35061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215173, 'error': None, 'target': 'ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.921 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed11b71b-745b-4f0c-9f09-37d53d166bcb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:05:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:33.921 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[e94aa718-206b-486a-875a-c912011f1e2c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:05:33 compute-0 systemd[1]: run-netns-ovnmeta\x2ded11b71b\x2d745b\x2d4f0c\x2d9f09\x2d37d53d166bcb.mount: Deactivated successfully.
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.968 187247 DEBUG nova.virt.libvirt.guest [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '5d86e858-6a62-411e-a8dc-dffcfa247bfc' (instance-0000000e) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.968 187247 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migration operation has completed
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.968 187247 INFO nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] _post_live_migration() is started..
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.981 187247 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:33 compute-0 nova_compute[187243]: 2025-12-03 00:05:33.981 187247 WARNING neutronclient.v2_0.client [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.483 187247 DEBUG nova.compute.manager [req-1d659f18-2b79-4fc3-8088-53148b5ad4e4 req-4af3eeff-219f-4a63-ba04-a0f2aea92678 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.483 187247 DEBUG oslo_concurrency.lockutils [req-1d659f18-2b79-4fc3-8088-53148b5ad4e4 req-4af3eeff-219f-4a63-ba04-a0f2aea92678 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.484 187247 DEBUG oslo_concurrency.lockutils [req-1d659f18-2b79-4fc3-8088-53148b5ad4e4 req-4af3eeff-219f-4a63-ba04-a0f2aea92678 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.484 187247 DEBUG oslo_concurrency.lockutils [req-1d659f18-2b79-4fc3-8088-53148b5ad4e4 req-4af3eeff-219f-4a63-ba04-a0f2aea92678 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.485 187247 DEBUG nova.compute.manager [req-1d659f18-2b79-4fc3-8088-53148b5ad4e4 req-4af3eeff-219f-4a63-ba04-a0f2aea92678 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.485 187247 DEBUG nova.compute.manager [req-1d659f18-2b79-4fc3-8088-53148b5ad4e4 req-4af3eeff-219f-4a63-ba04-a0f2aea92678 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:34 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:34.621 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:05:34 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:34.623 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:05:34 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:05:34.625 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.649 187247 DEBUG nova.compute.manager [req-2bbaa44c-be3d-4307-86fc-73cdfe1de358 req-5a6a235b-b871-4737-a862-7eebcd8edbf9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.649 187247 DEBUG oslo_concurrency.lockutils [req-2bbaa44c-be3d-4307-86fc-73cdfe1de358 req-5a6a235b-b871-4737-a862-7eebcd8edbf9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.650 187247 DEBUG oslo_concurrency.lockutils [req-2bbaa44c-be3d-4307-86fc-73cdfe1de358 req-5a6a235b-b871-4737-a862-7eebcd8edbf9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.650 187247 DEBUG oslo_concurrency.lockutils [req-2bbaa44c-be3d-4307-86fc-73cdfe1de358 req-5a6a235b-b871-4737-a862-7eebcd8edbf9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.650 187247 DEBUG nova.compute.manager [req-2bbaa44c-be3d-4307-86fc-73cdfe1de358 req-5a6a235b-b871-4737-a862-7eebcd8edbf9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.650 187247 DEBUG nova.compute.manager [req-2bbaa44c-be3d-4307-86fc-73cdfe1de358 req-5a6a235b-b871-4737-a862-7eebcd8edbf9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:34 compute-0 nova_compute[187243]: 2025-12-03 00:05:34.654 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.583 187247 DEBUG nova.network.neutron [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 3fc60c87-0094-403e-9fb0-564004da22b1 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.583 187247 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.584 187247 DEBUG nova.virt.libvirt.vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:04:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1547033723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1547033723',id=14,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:04:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='85e2f91a92cf4b5a9d626e8418f17322',ramdisk_id='',reservation_id='r-rbawllbh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1767783627-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:05:11Z,user_data=None,user_id='ab182b4a69794d1fa103fbd3d503df99',uuid=5d86e858-6a62-411e-a8dc-dffcfa247bfc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.585 187247 DEBUG nova.network.os_vif_util [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "3fc60c87-0094-403e-9fb0-564004da22b1", "address": "fa:16:3e:6d:ee:e6", "network": {"id": "ed11b71b-745b-4f0c-9f09-37d53d166bcb", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2033289530-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "722828099f1644218029b73eaf67d6b4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3fc60c87-00", "ovs_interfaceid": "3fc60c87-0094-403e-9fb0-564004da22b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.585 187247 DEBUG nova.network.os_vif_util [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.586 187247 DEBUG os_vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.588 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.589 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3fc60c87-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.590 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.592 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.593 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.593 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=38a0319a-9d81-467a-baec-72f5b209e699) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.594 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.596 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.599 187247 INFO os_vif [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ee:e6,bridge_name='br-int',has_traffic_filtering=True,id=3fc60c87-0094-403e-9fb0-564004da22b1,network=Network(ed11b71b-745b-4f0c-9f09-37d53d166bcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3fc60c87-00')
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.599 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.600 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.600 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.600 187247 DEBUG nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.601 187247 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Deleting instance files /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc_del
Dec 03 00:05:35 compute-0 nova_compute[187243]: 2025-12-03 00:05:35.602 187247 INFO nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Deletion of /var/lib/nova/instances/5d86e858-6a62-411e-a8dc-dffcfa247bfc_del complete
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.606 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.607 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.607 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.607 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.607 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.608 187247 WARNING nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received unexpected event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with vm_state active and task_state migrating.
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.608 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.608 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.608 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.609 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.609 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.609 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-unplugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.610 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.610 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.610 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.610 187247 DEBUG oslo_concurrency.lockutils [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.611 187247 DEBUG nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:36 compute-0 nova_compute[187243]: 2025-12-03 00:05:36.611 187247 WARNING nova.compute.manager [req-c19c37f3-ce0b-4719-a764-7848b07504e4 req-26df2596-2966-4f6c-99d1-cdd299c85d9c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received unexpected event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with vm_state active and task_state migrating.
Dec 03 00:05:37 compute-0 nova_compute[187243]: 2025-12-03 00:05:37.416 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:37 compute-0 nova_compute[187243]: 2025-12-03 00:05:37.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.667 187247 DEBUG nova.compute.manager [req-44e0b1de-1e88-42ac-9552-b6ec3ba2b481 req-b6b61ad2-7dab-4b78-ad65-6307543c24de 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.667 187247 DEBUG oslo_concurrency.lockutils [req-44e0b1de-1e88-42ac-9552-b6ec3ba2b481 req-b6b61ad2-7dab-4b78-ad65-6307543c24de 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.667 187247 DEBUG oslo_concurrency.lockutils [req-44e0b1de-1e88-42ac-9552-b6ec3ba2b481 req-b6b61ad2-7dab-4b78-ad65-6307543c24de 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.668 187247 DEBUG oslo_concurrency.lockutils [req-44e0b1de-1e88-42ac-9552-b6ec3ba2b481 req-b6b61ad2-7dab-4b78-ad65-6307543c24de 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.668 187247 DEBUG nova.compute.manager [req-44e0b1de-1e88-42ac-9552-b6ec3ba2b481 req-b6b61ad2-7dab-4b78-ad65-6307543c24de 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] No waiting events found dispatching network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:05:38 compute-0 nova_compute[187243]: 2025-12-03 00:05:38.669 187247 WARNING nova.compute.manager [req-44e0b1de-1e88-42ac-9552-b6ec3ba2b481 req-b6b61ad2-7dab-4b78-ad65-6307543c24de 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Received unexpected event network-vif-plugged-3fc60c87-0094-403e-9fb0-564004da22b1 for instance with vm_state active and task_state migrating.
Dec 03 00:05:40 compute-0 nova_compute[187243]: 2025-12-03 00:05:40.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:40 compute-0 nova_compute[187243]: 2025-12-03 00:05:40.595 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:41 compute-0 podman[215177]: 2025-12-03 00:05:41.106398459 +0000 UTC m=+0.057239673 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:05:41 compute-0 nova_compute[187243]: 2025-12-03 00:05:41.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:41 compute-0 nova_compute[187243]: 2025-12-03 00:05:41.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:42 compute-0 nova_compute[187243]: 2025-12-03 00:05:42.417 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:43 compute-0 nova_compute[187243]: 2025-12-03 00:05:43.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:05:44 compute-0 podman[215201]: 2025-12-03 00:05:44.082417469 +0000 UTC m=+0.045202621 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.105 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:05:44 compute-0 podman[215202]: 2025-12-03 00:05:44.114319035 +0000 UTC m=+0.074496333 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.242 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.243 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.261 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.262 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5822MB free_disk=73.16488265991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.262 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.263 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.636 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.637 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:44 compute-0 nova_compute[187243]: 2025-12-03 00:05:44.637 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5d86e858-6a62-411e-a8dc-dffcfa247bfc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.147 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.281 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration for instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.598 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.787 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.817 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration ad96c46c-250d-4dee-aab8-996ce344a8d0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.818 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.818 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:05:44 up  1:13,  0 user,  load average: 0.17, 0.19, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:05:45 compute-0 nova_compute[187243]: 2025-12-03 00:05:45.857 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:05:46 compute-0 nova_compute[187243]: 2025-12-03 00:05:46.363 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:05:46 compute-0 nova_compute[187243]: 2025-12-03 00:05:46.872 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:05:46 compute-0 nova_compute[187243]: 2025-12-03 00:05:46.872 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.610s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:46 compute-0 nova_compute[187243]: 2025-12-03 00:05:46.873 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.726s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:46 compute-0 nova_compute[187243]: 2025-12-03 00:05:46.873 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:46 compute-0 nova_compute[187243]: 2025-12-03 00:05:46.874 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.012 187247 WARNING nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.013 187247 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.034 187247 DEBUG oslo_concurrency.processutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.035 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.16494750976562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.036 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.036 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:05:47 compute-0 nova_compute[187243]: 2025-12-03 00:05:47.418 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:48 compute-0 sshd-session[215175]: Received disconnect from 45.78.218.154 port 39926:11: Bye Bye [preauth]
Dec 03 00:05:48 compute-0 sshd-session[215175]: Disconnected from 45.78.218.154 port 39926 [preauth]
Dec 03 00:05:48 compute-0 nova_compute[187243]: 2025-12-03 00:05:48.055 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 5d86e858-6a62-411e-a8dc-dffcfa247bfc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:05:48 compute-0 nova_compute[187243]: 2025-12-03 00:05:48.567 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:05:48 compute-0 nova_compute[187243]: 2025-12-03 00:05:48.581 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration ad96c46c-250d-4dee-aab8-996ce344a8d0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:05:48 compute-0 nova_compute[187243]: 2025-12-03 00:05:48.581 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:05:48 compute-0 nova_compute[187243]: 2025-12-03 00:05:48.582 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:05:47 up  1:13,  0 user,  load average: 0.15, 0.19, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:05:48 compute-0 nova_compute[187243]: 2025-12-03 00:05:48.628 187247 DEBUG nova.compute.provider_tree [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:05:49 compute-0 nova_compute[187243]: 2025-12-03 00:05:49.135 187247 DEBUG nova.scheduler.client.report [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:05:49 compute-0 nova_compute[187243]: 2025-12-03 00:05:49.650 187247 DEBUG nova.compute.resource_tracker [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:05:49 compute-0 nova_compute[187243]: 2025-12-03 00:05:49.651 187247 DEBUG oslo_concurrency.lockutils [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.615s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:05:49 compute-0 nova_compute[187243]: 2025-12-03 00:05:49.673 187247 INFO nova.compute.manager [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:05:50 compute-0 nova_compute[187243]: 2025-12-03 00:05:50.601 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:50 compute-0 nova_compute[187243]: 2025-12-03 00:05:50.766 187247 INFO nova.scheduler.client.report [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration ad96c46c-250d-4dee-aab8-996ce344a8d0
Dec 03 00:05:50 compute-0 nova_compute[187243]: 2025-12-03 00:05:50.766 187247 DEBUG nova.virt.libvirt.driver [None req-10c9ee09-3450-49c2-bf61-76baa73b78c1 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5d86e858-6a62-411e-a8dc-dffcfa247bfc] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:05:52 compute-0 nova_compute[187243]: 2025-12-03 00:05:52.460 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:55 compute-0 nova_compute[187243]: 2025-12-03 00:05:55.605 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:57 compute-0 nova_compute[187243]: 2025-12-03 00:05:57.497 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:05:58 compute-0 podman[215249]: 2025-12-03 00:05:58.112833722 +0000 UTC m=+0.069386970 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, release=1755695350, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9)
Dec 03 00:05:59 compute-0 sshd-session[215273]: Invalid user builder from 20.123.120.169 port 45540
Dec 03 00:05:59 compute-0 sshd-session[215273]: Received disconnect from 20.123.120.169 port 45540:11: Bye Bye [preauth]
Dec 03 00:05:59 compute-0 sshd-session[215273]: Disconnected from invalid user builder 20.123.120.169 port 45540 [preauth]
Dec 03 00:05:59 compute-0 sshd-session[215254]: Invalid user scan from 49.247.36.49 port 32685
Dec 03 00:05:59 compute-0 sshd-session[215254]: Received disconnect from 49.247.36.49 port 32685:11: Bye Bye [preauth]
Dec 03 00:05:59 compute-0 sshd-session[215254]: Disconnected from invalid user scan 49.247.36.49 port 32685 [preauth]
Dec 03 00:05:59 compute-0 podman[197600]: time="2025-12-03T00:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:05:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:05:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec 03 00:06:00 compute-0 nova_compute[187243]: 2025-12-03 00:06:00.609 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:00.694 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:00.694 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:00.694 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: ERROR   00:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: ERROR   00:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: ERROR   00:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: ERROR   00:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: ERROR   00:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:06:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:06:02 compute-0 nova_compute[187243]: 2025-12-03 00:06:02.529 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:03 compute-0 podman[215277]: 2025-12-03 00:06:03.15535368 +0000 UTC m=+0.097506962 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 03 00:06:03 compute-0 nova_compute[187243]: 2025-12-03 00:06:03.480 187247 DEBUG nova.compute.manager [None req-e9859263-a354-4a90-a769-67b08646d58e 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Dec 03 00:06:03 compute-0 nova_compute[187243]: 2025-12-03 00:06:03.523 187247 DEBUG nova.compute.provider_tree [None req-e9859263-a354-4a90-a769-67b08646d58e 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 23 to 26 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:06:05 compute-0 nova_compute[187243]: 2025-12-03 00:06:05.612 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:07 compute-0 nova_compute[187243]: 2025-12-03 00:06:07.150 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:07 compute-0 nova_compute[187243]: 2025-12-03 00:06:07.532 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:10 compute-0 nova_compute[187243]: 2025-12-03 00:06:10.615 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:12 compute-0 podman[215298]: 2025-12-03 00:06:12.133830776 +0000 UTC m=+0.077340473 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:06:12 compute-0 nova_compute[187243]: 2025-12-03 00:06:12.535 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:15 compute-0 podman[215322]: 2025-12-03 00:06:15.096814228 +0000 UTC m=+0.057306325 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:06:15 compute-0 podman[215323]: 2025-12-03 00:06:15.152424041 +0000 UTC m=+0.110995131 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:06:15 compute-0 nova_compute[187243]: 2025-12-03 00:06:15.617 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:17 compute-0 nova_compute[187243]: 2025-12-03 00:06:17.572 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:19 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:19.576 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:f8:06 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b356b9112e0c4e6083f56fc1c7796972', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=df9da247-f3c2-412c-95a4-9a2562c93dd4) old=Port_Binding(mac=['fa:16:3e:04:f8:06'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b356b9112e0c4e6083f56fc1c7796972', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:19 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:19.578 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port df9da247-f3c2-412c-95a4-9a2562c93dd4 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 updated
Dec 03 00:06:19 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:19.579 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:06:19 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:19.581 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4b4484-b25d-40cc-86b9-ace57d76cb43]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:20 compute-0 nova_compute[187243]: 2025-12-03 00:06:20.620 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:22 compute-0 nova_compute[187243]: 2025-12-03 00:06:22.629 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:25 compute-0 sshd-session[215368]: Invalid user elsearch from 45.78.219.213 port 39536
Dec 03 00:06:25 compute-0 nova_compute[187243]: 2025-12-03 00:06:25.623 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:25 compute-0 sshd-session[215368]: Received disconnect from 45.78.219.213 port 39536:11: Bye Bye [preauth]
Dec 03 00:06:25 compute-0 sshd-session[215368]: Disconnected from invalid user elsearch 45.78.219.213 port 39536 [preauth]
Dec 03 00:06:27 compute-0 sshd-session[215370]: Connection closed by 45.78.219.95 port 47002 [preauth]
Dec 03 00:06:27 compute-0 nova_compute[187243]: 2025-12-03 00:06:27.633 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:28.802 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:17:82 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cd602f0-6d27-4f32-958a-fa46ec296bd3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=94de005e-3a8b-4f10-829a-627ec3895f56) old=Port_Binding(mac=['fa:16:3e:36:17:82'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-633013f1-c17e-45b0-841b-2c82c9dddeea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:28.803 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 94de005e-3a8b-4f10-829a-627ec3895f56 in datapath 633013f1-c17e-45b0-841b-2c82c9dddeea updated
Dec 03 00:06:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:28.805 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 633013f1-c17e-45b0-841b-2c82c9dddeea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:06:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:28.806 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1a41bca4-740c-4277-92b3-7da5765f2c71]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:29 compute-0 podman[215372]: 2025-12-03 00:06:29.171950589 +0000 UTC m=+0.119672473 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:06:29 compute-0 podman[197600]: time="2025-12-03T00:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:06:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:06:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec 03 00:06:30 compute-0 nova_compute[187243]: 2025-12-03 00:06:30.627 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: ERROR   00:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: ERROR   00:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: ERROR   00:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: ERROR   00:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: ERROR   00:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:06:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:06:32 compute-0 nova_compute[187243]: 2025-12-03 00:06:32.702 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:33 compute-0 sshd-session[215394]: Invalid user root1 from 23.95.37.90 port 53996
Dec 03 00:06:33 compute-0 sshd-session[215394]: Received disconnect from 23.95.37.90 port 53996:11: Bye Bye [preauth]
Dec 03 00:06:33 compute-0 sshd-session[215394]: Disconnected from invalid user root1 23.95.37.90 port 53996 [preauth]
Dec 03 00:06:33 compute-0 podman[215396]: 2025-12-03 00:06:33.885608334 +0000 UTC m=+0.071971573 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:06:34 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:34.991 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:34 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:34.992 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:06:34 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:34.993 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:35 compute-0 nova_compute[187243]: 2025-12-03 00:06:35.031 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:35 compute-0 nova_compute[187243]: 2025-12-03 00:06:35.629 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:37 compute-0 nova_compute[187243]: 2025-12-03 00:06:37.704 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:37 compute-0 nova_compute[187243]: 2025-12-03 00:06:37.875 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:37 compute-0 nova_compute[187243]: 2025-12-03 00:06:37.875 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:37 compute-0 nova_compute[187243]: 2025-12-03 00:06:37.875 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:37 compute-0 nova_compute[187243]: 2025-12-03 00:06:37.875 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:06:39 compute-0 nova_compute[187243]: 2025-12-03 00:06:39.589 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:40 compute-0 nova_compute[187243]: 2025-12-03 00:06:40.291 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:40 compute-0 nova_compute[187243]: 2025-12-03 00:06:40.291 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:40 compute-0 nova_compute[187243]: 2025-12-03 00:06:40.632 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:40 compute-0 ovn_controller[95488]: 2025-12-03T00:06:40Z|00120|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec 03 00:06:40 compute-0 nova_compute[187243]: 2025-12-03 00:06:40.799 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:06:41 compute-0 sshd-session[215419]: Invalid user webuser from 61.220.235.10 port 34114
Dec 03 00:06:41 compute-0 sshd-session[215417]: Received disconnect from 102.210.148.92 port 51204:11: Bye Bye [preauth]
Dec 03 00:06:41 compute-0 sshd-session[215417]: Disconnected from authenticating user root 102.210.148.92 port 51204 [preauth]
Dec 03 00:06:41 compute-0 sshd-session[215419]: Received disconnect from 61.220.235.10 port 34114:11: Bye Bye [preauth]
Dec 03 00:06:41 compute-0 sshd-session[215419]: Disconnected from invalid user webuser 61.220.235.10 port 34114 [preauth]
Dec 03 00:06:41 compute-0 nova_compute[187243]: 2025-12-03 00:06:41.346 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:41 compute-0 nova_compute[187243]: 2025-12-03 00:06:41.347 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:41 compute-0 nova_compute[187243]: 2025-12-03 00:06:41.356 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:06:41 compute-0 nova_compute[187243]: 2025-12-03 00:06:41.356 187247 INFO nova.compute.claims [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:06:41 compute-0 nova_compute[187243]: 2025-12-03 00:06:41.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.386 187247 DEBUG nova.scheduler.client.report [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.397 187247 DEBUG nova.scheduler.client.report [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.397 187247 DEBUG nova.compute.provider_tree [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.413 187247 DEBUG nova.scheduler.client.report [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.439 187247 DEBUG nova.scheduler.client.report [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.469 187247 DEBUG nova.compute.provider_tree [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:42 compute-0 nova_compute[187243]: 2025-12-03 00:06:42.705 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:43 compute-0 nova_compute[187243]: 2025-12-03 00:06:43.062 187247 DEBUG nova.scheduler.client.report [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:06:43 compute-0 podman[215421]: 2025-12-03 00:06:43.131515298 +0000 UTC m=+0.079984808 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:06:43 compute-0 nova_compute[187243]: 2025-12-03 00:06:43.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:43 compute-0 nova_compute[187243]: 2025-12-03 00:06:43.785 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.438s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:43 compute-0 nova_compute[187243]: 2025-12-03 00:06:43.785 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.104 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.296 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.297 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.329 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.330 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=73.1649284362793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.331 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.331 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.371 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.372 187247 DEBUG nova.network.neutron [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.374 187247 WARNING neutronclient.v2_0.client [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.375 187247 WARNING neutronclient.v2_0.client [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:06:44 compute-0 nova_compute[187243]: 2025-12-03 00:06:44.886 187247 INFO nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.378 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 5187b0f8-a8d1-4c99-a0b9-809caf89b88a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.378 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.379 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:06:44 up  1:14,  0 user,  load average: 0.09, 0.17, 0.30\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.397 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.428 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.530 187247 DEBUG nova.network.neutron [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Successfully created port: ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.635 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:45 compute-0 nova_compute[187243]: 2025-12-03 00:06:45.937 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.101 187247 DEBUG nova.network.neutron [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Successfully updated port: ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:06:46 compute-0 podman[215446]: 2025-12-03 00:06:46.126312325 +0000 UTC m=+0.072072035 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.171 187247 DEBUG nova.compute.manager [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-changed-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.172 187247 DEBUG nova.compute.manager [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Refreshing instance network info cache due to event network-changed-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.173 187247 DEBUG oslo_concurrency.lockutils [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.173 187247 DEBUG oslo_concurrency.lockutils [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.173 187247 DEBUG nova.network.neutron [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Refreshing network info cache for port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:06:46 compute-0 podman[215447]: 2025-12-03 00:06:46.176427094 +0000 UTC m=+0.127468093 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.431 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.434 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.434 187247 INFO nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Creating image(s)
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.436 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.436 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.438 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.439 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.447 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.451 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.451 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.452 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.511 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.514 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.515 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.516 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.522 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.523 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.577 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.578 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.609 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.643 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.645 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.646 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.680 187247 WARNING neutronclient.v2_0.client [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.747 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.748 187247 DEBUG nova.virt.disk.api [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Checking if we can resize image /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.749 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.846 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.847 187247 DEBUG nova.virt.disk.api [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Cannot resize image /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.848 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.848 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Ensure instance console log exists: /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.849 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.849 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:46 compute-0 nova_compute[187243]: 2025-12-03 00:06:46.849 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:47 compute-0 nova_compute[187243]: 2025-12-03 00:06:47.556 187247 DEBUG nova.network.neutron [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:06:47 compute-0 nova_compute[187243]: 2025-12-03 00:06:47.707 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:48 compute-0 nova_compute[187243]: 2025-12-03 00:06:48.215 187247 DEBUG nova.network.neutron [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:06:48 compute-0 nova_compute[187243]: 2025-12-03 00:06:48.899 187247 DEBUG oslo_concurrency.lockutils [req-804f034d-b9d1-48b1-93ec-9e9e84220b91 req-55472bbd-6e99-45b0-8152-e8c03d912ae7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:06:48 compute-0 nova_compute[187243]: 2025-12-03 00:06:48.900 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquired lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:06:48 compute-0 nova_compute[187243]: 2025-12-03 00:06:48.900 187247 DEBUG nova.network.neutron [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:06:49 compute-0 nova_compute[187243]: 2025-12-03 00:06:49.817 187247 DEBUG nova.network.neutron [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.073 187247 WARNING neutronclient.v2_0.client [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.226 187247 DEBUG nova.network.neutron [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.639 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.733 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Releasing lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.734 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance network_info: |[{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.736 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Start _get_guest_xml network_info=[{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.740 187247 WARNING nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.741 187247 DEBUG nova.virt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656', uuid='5187b0f8-a8d1-4c99-a0b9-809caf89b88a'), owner=OwnerMeta(userid='d7f72082c96e4f868d5b158a57237cee', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin', projectid='869170c9b0864bd8a0f2258e90e55a84', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720410.7418714) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.745 187247 DEBUG nova.virt.libvirt.host [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.746 187247 DEBUG nova.virt.libvirt.host [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.749 187247 DEBUG nova.virt.libvirt.host [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.749 187247 DEBUG nova.virt.libvirt.host [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.750 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.751 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.751 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.752 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.752 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.752 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.752 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.753 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.753 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.753 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.753 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.754 187247 DEBUG nova.virt.hardware [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.758 187247 DEBUG nova.virt.libvirt.vif [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-129',id=16,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-0lswcfs9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',own
er_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:06:45Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.758 187247 DEBUG nova.network.os_vif_util [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.759 187247 DEBUG nova.network.os_vif_util [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:06:50 compute-0 nova_compute[187243]: 2025-12-03 00:06:50.760 187247 DEBUG nova.objects.instance [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5187b0f8-a8d1-4c99-a0b9-809caf89b88a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.272 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <uuid>5187b0f8-a8d1-4c99-a0b9-809caf89b88a</uuid>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <name>instance-00000010</name>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656</nova:name>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:06:50</nova:creationTime>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:06:51 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:06:51 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         <nova:port uuid="ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0">
Dec 03 00:06:51 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <system>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <entry name="serial">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <entry name="uuid">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </system>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <os>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </os>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <features>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </features>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:7d:13:00"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <target dev="tapebecba8e-a0"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <video>
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </video>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:06:51 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:06:51 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:06:51 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:06:51 compute-0 nova_compute[187243]: </domain>
Dec 03 00:06:51 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.274 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Preparing to wait for external event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.274 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.274 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.274 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.275 187247 DEBUG nova.virt.libvirt.vif [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-129',id=16,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-0lswcfs9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-15471
26579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:06:45Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.275 187247 DEBUG nova.network.os_vif_util [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.276 187247 DEBUG nova.network.os_vif_util [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.276 187247 DEBUG os_vif [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.277 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.277 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.278 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.278 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.278 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '11d1877e-6ca7-5dfb-a517-1864e0abc9eb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.279 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.281 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.283 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebecba8e-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.284 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapebecba8e-a0, col_values=(('qos', UUID('9b777392-e409-4575-be58-905f72d108ca')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.284 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapebecba8e-a0, col_values=(('external_ids', {'iface-id': 'ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:13:00', 'vm-uuid': '5187b0f8-a8d1-4c99-a0b9-809caf89b88a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.285 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 NetworkManager[55671]: <info>  [1764720411.2867] manager: (tapebecba8e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.287 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.292 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:51 compute-0 nova_compute[187243]: 2025-12-03 00:06:51.293 187247 INFO os_vif [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0')
Dec 03 00:06:52 compute-0 nova_compute[187243]: 2025-12-03 00:06:52.708 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:52 compute-0 nova_compute[187243]: 2025-12-03 00:06:52.832 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:06:52 compute-0 nova_compute[187243]: 2025-12-03 00:06:52.833 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:06:52 compute-0 nova_compute[187243]: 2025-12-03 00:06:52.833 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No VIF found with MAC fa:16:3e:7d:13:00, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:06:52 compute-0 nova_compute[187243]: 2025-12-03 00:06:52.833 187247 INFO nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Using config drive
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.351 187247 WARNING neutronclient.v2_0.client [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.448 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.574 187247 INFO nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Creating config drive at /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.579 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp7wlg94n0 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.700 187247 DEBUG oslo_concurrency.processutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp7wlg94n0" returned: 0 in 0.121s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:06:53 compute-0 kernel: tapebecba8e-a0: entered promiscuous mode
Dec 03 00:06:53 compute-0 NetworkManager[55671]: <info>  [1764720413.7531] manager: (tapebecba8e-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Dec 03 00:06:53 compute-0 ovn_controller[95488]: 2025-12-03T00:06:53Z|00121|binding|INFO|Claiming lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for this chassis.
Dec 03 00:06:53 compute-0 ovn_controller[95488]: 2025-12-03T00:06:53Z|00122|binding|INFO|ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0: Claiming fa:16:3e:7d:13:00 10.100.0.14
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.753 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.757 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.771 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:13:00 10.100.0.14'], port_security=['fa:16:3e:7d:13:00 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5187b0f8-a8d1-4c99-a0b9-809caf89b88a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.772 104379 INFO neutron.agent.ovn.metadata.agent [-] Port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 bound to our chassis
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.774 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:06:53 compute-0 systemd-udevd[215526]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.791 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0385dd13-4aae-47fa-b411-fc8f78a5f988]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.792 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c6ad8f4-61 in ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.793 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c6ad8f4-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.793 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[317c2eea-dcb2-4f35-9cdf-0b491b251bfb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.794 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[950d743e-8532-47f7-b009-cae08ab3e8c7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 NetworkManager[55671]: <info>  [1764720413.8015] device (tapebecba8e-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:06:53 compute-0 NetworkManager[55671]: <info>  [1764720413.8032] device (tapebecba8e-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.811 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[a2724906-fb1c-4108-91d7-da8c33fe9b64]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.813 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:53 compute-0 ovn_controller[95488]: 2025-12-03T00:06:53Z|00123|binding|INFO|Setting lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 ovn-installed in OVS
Dec 03 00:06:53 compute-0 ovn_controller[95488]: 2025-12-03T00:06:53Z|00124|binding|INFO|Setting lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 up in Southbound
Dec 03 00:06:53 compute-0 nova_compute[187243]: 2025-12-03 00:06:53.820 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:53 compute-0 systemd-machined[153518]: New machine qemu-10-instance-00000010.
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.828 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0369530a-43ce-4391-a7a6-407a5ec99645]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-00000010.
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.854 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9dfa4a-aa84-4c96-b63b-f17b964b07cf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.858 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3a01c0d4-403a-41ba-9bef-1a1973455370]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 NetworkManager[55671]: <info>  [1764720413.8593] manager: (tap9c6ad8f4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.888 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[32d8592c-6ca2-4c1d-80e1-f32f5ab412a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.891 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[9119c925-a6ae-449f-97d1-25824516a05f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 NetworkManager[55671]: <info>  [1764720413.9127] device (tap9c6ad8f4-60): carrier: link connected
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.918 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[063388e7-6187-4227-b6a5-9522519e97e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.933 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eb79349a-275e-41ef-a6e0-fad6e2f1e20c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450455, 'reachable_time': 18918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215562, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.945 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[77de213e-8a14-4364-b91f-b398534a421b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:f806'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450455, 'tstamp': 450455}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215563, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.960 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[94e3b764-3941-40d8-a45b-d9788d9b11c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450455, 'reachable_time': 18918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215564, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:53.989 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bcf15b45-af1c-4c6f-ac98-e9028eba59e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.039 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f973a6e4-dd14-4e2c-bf1b-2e70d27f6a3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.040 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.040 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.040 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.041 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:54 compute-0 kernel: tap9c6ad8f4-60: entered promiscuous mode
Dec 03 00:06:54 compute-0 NetworkManager[55671]: <info>  [1764720414.0425] manager: (tap9c6ad8f4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.043 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.044 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.045 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:54 compute-0 ovn_controller[95488]: 2025-12-03T00:06:54Z|00125|binding|INFO|Releasing lport df9da247-f3c2-412c-95a4-9a2562c93dd4 from this chassis (sb_readonly=0)
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.046 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.048 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[beb458e7-f931-4c6b-9437-5585d8502f29]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.048 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.048 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.048 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.049 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.049 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[56b12b8c-0dd4-418f-9c5f-af9c04621dc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.049 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.049 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc2a0d0-b073-4dce-bb4f-cd3b35eba6ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.050 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:06:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:06:54.050 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'env', 'PROCESS_TAG=haproxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.058 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.414 187247 DEBUG nova.compute.manager [req-4d852656-b8af-444b-8274-288ddae9e804 req-7b5ff740-b885-4e0c-9048-23cada757ddc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.415 187247 DEBUG oslo_concurrency.lockutils [req-4d852656-b8af-444b-8274-288ddae9e804 req-7b5ff740-b885-4e0c-9048-23cada757ddc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.415 187247 DEBUG oslo_concurrency.lockutils [req-4d852656-b8af-444b-8274-288ddae9e804 req-7b5ff740-b885-4e0c-9048-23cada757ddc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.416 187247 DEBUG oslo_concurrency.lockutils [req-4d852656-b8af-444b-8274-288ddae9e804 req-7b5ff740-b885-4e0c-9048-23cada757ddc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.416 187247 DEBUG nova.compute.manager [req-4d852656-b8af-444b-8274-288ddae9e804 req-7b5ff740-b885-4e0c-9048-23cada757ddc 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Processing event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.417 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.422 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.427 187247 INFO nova.virt.libvirt.driver [-] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance spawned successfully.
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.428 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:06:54 compute-0 podman[215603]: 2025-12-03 00:06:54.514281403 +0000 UTC m=+0.096636972 container create cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:06:54 compute-0 podman[215603]: 2025-12-03 00:06:54.466115391 +0000 UTC m=+0.048471020 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:06:54 compute-0 systemd[1]: Started libpod-conmon-cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65.scope.
Dec 03 00:06:54 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:06:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f8692124caeeed2a34490509358303fdab42753d7a18e2f461c12b97393a524/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:06:54 compute-0 podman[215603]: 2025-12-03 00:06:54.624162467 +0000 UTC m=+0.206518036 container init cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Dec 03 00:06:54 compute-0 podman[215603]: 2025-12-03 00:06:54.632622893 +0000 UTC m=+0.214978432 container start cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 03 00:06:54 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [NOTICE]   (215622) : New worker (215624) forked
Dec 03 00:06:54 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [NOTICE]   (215622) : Loading success.
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.944 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.944 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.945 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.945 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.946 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:06:54 compute-0 nova_compute[187243]: 2025-12-03 00:06:54.946 187247 DEBUG nova.virt.libvirt.driver [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:06:55 compute-0 nova_compute[187243]: 2025-12-03 00:06:55.458 187247 INFO nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Took 9.03 seconds to spawn the instance on the hypervisor.
Dec 03 00:06:55 compute-0 nova_compute[187243]: 2025-12-03 00:06:55.459 187247 DEBUG nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:06:55 compute-0 nova_compute[187243]: 2025-12-03 00:06:55.988 187247 INFO nova.compute.manager [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Took 14.68 seconds to build instance.
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.287 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.488 187247 DEBUG nova.compute.manager [req-a27ebc2b-eb57-46a2-b866-df55ae62f44a req-62a9ee12-8b7e-4e86-98d9-d14783fd3ac8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.488 187247 DEBUG oslo_concurrency.lockutils [req-a27ebc2b-eb57-46a2-b866-df55ae62f44a req-62a9ee12-8b7e-4e86-98d9-d14783fd3ac8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.489 187247 DEBUG oslo_concurrency.lockutils [req-a27ebc2b-eb57-46a2-b866-df55ae62f44a req-62a9ee12-8b7e-4e86-98d9-d14783fd3ac8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.489 187247 DEBUG oslo_concurrency.lockutils [req-a27ebc2b-eb57-46a2-b866-df55ae62f44a req-62a9ee12-8b7e-4e86-98d9-d14783fd3ac8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.489 187247 DEBUG nova.compute.manager [req-a27ebc2b-eb57-46a2-b866-df55ae62f44a req-62a9ee12-8b7e-4e86-98d9-d14783fd3ac8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.489 187247 WARNING nova.compute.manager [req-a27ebc2b-eb57-46a2-b866-df55ae62f44a req-62a9ee12-8b7e-4e86-98d9-d14783fd3ac8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received unexpected event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with vm_state active and task_state None.
Dec 03 00:06:56 compute-0 nova_compute[187243]: 2025-12-03 00:06:56.493 187247 DEBUG oslo_concurrency.lockutils [None req-43fc9033-037e-4f11-b74d-3154c4045625 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.201s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:06:57 compute-0 nova_compute[187243]: 2025-12-03 00:06:57.709 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:06:59 compute-0 podman[197600]: time="2025-12-03T00:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:06:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:06:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec 03 00:06:59 compute-0 sshd-session[215508]: Received disconnect from 45.78.222.160 port 47984:11: Bye Bye [preauth]
Dec 03 00:06:59 compute-0 sshd-session[215508]: Disconnected from 45.78.222.160 port 47984 [preauth]
Dec 03 00:07:00 compute-0 podman[215633]: 2025-12-03 00:07:00.154334591 +0000 UTC m=+0.092022661 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Dec 03 00:07:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:00.695 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:00.695 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:00.696 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:01 compute-0 nova_compute[187243]: 2025-12-03 00:07:01.291 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: ERROR   00:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: ERROR   00:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: ERROR   00:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: ERROR   00:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: ERROR   00:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:07:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:07:02 compute-0 nova_compute[187243]: 2025-12-03 00:07:02.710 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:04 compute-0 podman[215655]: 2025-12-03 00:07:04.098777147 +0000 UTC m=+0.058058474 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 03 00:07:05 compute-0 sshd-session[215656]: Invalid user admin from 80.94.95.116 port 55670
Dec 03 00:07:06 compute-0 sshd-session[215656]: Connection closed by invalid user admin 80.94.95.116 port 55670 [preauth]
Dec 03 00:07:06 compute-0 ovn_controller[95488]: 2025-12-03T00:07:06Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:13:00 10.100.0.14
Dec 03 00:07:06 compute-0 ovn_controller[95488]: 2025-12-03T00:07:06Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:13:00 10.100.0.14
Dec 03 00:07:06 compute-0 nova_compute[187243]: 2025-12-03 00:07:06.296 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:07 compute-0 nova_compute[187243]: 2025-12-03 00:07:07.713 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:11 compute-0 nova_compute[187243]: 2025-12-03 00:07:11.300 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:12 compute-0 nova_compute[187243]: 2025-12-03 00:07:12.729 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:14 compute-0 podman[215690]: 2025-12-03 00:07:14.103883207 +0000 UTC m=+0.058190277 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:07:16 compute-0 nova_compute[187243]: 2025-12-03 00:07:16.303 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:17 compute-0 podman[215714]: 2025-12-03 00:07:17.093520778 +0000 UTC m=+0.050820478 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 03 00:07:17 compute-0 podman[215715]: 2025-12-03 00:07:17.157421693 +0000 UTC m=+0.106704758 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:07:17 compute-0 nova_compute[187243]: 2025-12-03 00:07:17.731 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:21 compute-0 nova_compute[187243]: 2025-12-03 00:07:21.305 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:22 compute-0 nova_compute[187243]: 2025-12-03 00:07:22.733 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:23 compute-0 nova_compute[187243]: 2025-12-03 00:07:23.592 187247 DEBUG nova.compute.manager [None req-3d1247ad-3c1c-4247-8ef6-a10e636233fa 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Dec 03 00:07:23 compute-0 nova_compute[187243]: 2025-12-03 00:07:23.647 187247 DEBUG nova.compute.provider_tree [None req-3d1247ad-3c1c-4247-8ef6-a10e636233fa 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 27 to 28 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:07:26 compute-0 nova_compute[187243]: 2025-12-03 00:07:26.308 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:27 compute-0 nova_compute[187243]: 2025-12-03 00:07:27.736 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:29 compute-0 podman[197600]: time="2025-12-03T00:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:07:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:07:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Dec 03 00:07:30 compute-0 sshd-session[215758]: Received disconnect from 20.123.120.169 port 59416:11: Bye Bye [preauth]
Dec 03 00:07:30 compute-0 sshd-session[215758]: Disconnected from authenticating user root 20.123.120.169 port 59416 [preauth]
Dec 03 00:07:30 compute-0 nova_compute[187243]: 2025-12-03 00:07:30.958 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Check if temp file /var/lib/nova/instances/tmp6wlbjtie exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:07:30 compute-0 nova_compute[187243]: 2025-12-03 00:07:30.964 187247 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5187b0f8-a8d1-4c99-a0b9-809caf89b88a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:07:31 compute-0 podman[215760]: 2025-12-03 00:07:31.136449415 +0000 UTC m=+0.084838775 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 03 00:07:31 compute-0 nova_compute[187243]: 2025-12-03 00:07:31.312 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: ERROR   00:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: ERROR   00:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: ERROR   00:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: ERROR   00:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: ERROR   00:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:07:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:07:32 compute-0 nova_compute[187243]: 2025-12-03 00:07:32.781 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:35 compute-0 podman[215781]: 2025-12-03 00:07:35.126544241 +0000 UTC m=+0.075634282 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4)
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.245 187247 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.299 187247 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.300 187247 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.316 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.354 187247 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.356 187247 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Preparing to wait for external event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.356 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.357 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.357 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:36 compute-0 sshd-session[215803]: Received disconnect from 49.247.36.49 port 25199:11: Bye Bye [preauth]
Dec 03 00:07:36 compute-0 sshd-session[215803]: Disconnected from authenticating user root 49.247.36.49 port 25199 [preauth]
Dec 03 00:07:36 compute-0 nova_compute[187243]: 2025-12-03 00:07:36.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:37 compute-0 nova_compute[187243]: 2025-12-03 00:07:37.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:37 compute-0 nova_compute[187243]: 2025-12-03 00:07:37.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:37 compute-0 nova_compute[187243]: 2025-12-03 00:07:37.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:07:37 compute-0 nova_compute[187243]: 2025-12-03 00:07:37.838 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.320 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:07:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:41.781 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.781 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:41.782 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.794 187247 DEBUG nova.compute.manager [req-a7258dca-6943-4beb-b349-4e02022ae4de req-df2ed222-c72c-44d9-bf33-80cfab088f7f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.794 187247 DEBUG oslo_concurrency.lockutils [req-a7258dca-6943-4beb-b349-4e02022ae4de req-df2ed222-c72c-44d9-bf33-80cfab088f7f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.794 187247 DEBUG oslo_concurrency.lockutils [req-a7258dca-6943-4beb-b349-4e02022ae4de req-df2ed222-c72c-44d9-bf33-80cfab088f7f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.795 187247 DEBUG oslo_concurrency.lockutils [req-a7258dca-6943-4beb-b349-4e02022ae4de req-df2ed222-c72c-44d9-bf33-80cfab088f7f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.795 187247 DEBUG nova.compute.manager [req-a7258dca-6943-4beb-b349-4e02022ae4de req-df2ed222-c72c-44d9-bf33-80cfab088f7f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No event matching network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 in dict_keys([('network-vif-plugged', 'ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:07:41 compute-0 nova_compute[187243]: 2025-12-03 00:07:41.795 187247 DEBUG nova.compute.manager [req-a7258dca-6943-4beb-b349-4e02022ae4de req-df2ed222-c72c-44d9-bf33-80cfab088f7f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:07:42 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:42.784 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:42 compute-0 nova_compute[187243]: 2025-12-03 00:07:42.840 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:42 compute-0 nova_compute[187243]: 2025-12-03 00:07:42.875 187247 INFO nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Took 6.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:07:42 compute-0 ovn_controller[95488]: 2025-12-03T00:07:42Z|00126|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.886 187247 DEBUG nova.compute.manager [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.886 187247 DEBUG oslo_concurrency.lockutils [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.887 187247 DEBUG oslo_concurrency.lockutils [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.887 187247 DEBUG oslo_concurrency.lockutils [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.887 187247 DEBUG nova.compute.manager [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Processing event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.887 187247 DEBUG nova.compute.manager [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-changed-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.887 187247 DEBUG nova.compute.manager [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Refreshing instance network info cache due to event network-changed-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.888 187247 DEBUG oslo_concurrency.lockutils [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.888 187247 DEBUG oslo_concurrency.lockutils [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.888 187247 DEBUG nova.network.neutron [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Refreshing network info cache for port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:07:43 compute-0 nova_compute[187243]: 2025-12-03 00:07:43.889 187247 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.106 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:07:44 compute-0 podman[215815]: 2025-12-03 00:07:44.193766747 +0000 UTC m=+0.051469153 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.394 187247 WARNING neutronclient.v2_0.client [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.399 187247 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6wlbjtie',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5187b0f8-a8d1-4c99-a0b9-809caf89b88a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9e7124fa-e997-4c19-b812-98c74391064a),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.915 187247 DEBUG nova.objects.instance [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 5187b0f8-a8d1-4c99-a0b9-809caf89b88a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.918 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.920 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:07:44 compute-0 nova_compute[187243]: 2025-12-03 00:07:44.921 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.148 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.197 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.200 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.294 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.424 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.425 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.435 187247 DEBUG nova.virt.libvirt.vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-129',id=16,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:06:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-0lswcfs9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:06:55Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.436 187247 DEBUG nova.network.os_vif_util [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.437 187247 DEBUG nova.network.os_vif_util [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.438 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:7d:13:00"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <target dev="tapebecba8e-a0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]: </interface>
Dec 03 00:07:45 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.439 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <name>instance-00000010</name>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <uuid>5187b0f8-a8d1-4c99-a0b9-809caf89b88a</uuid>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656</nova:name>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:06:50</nova:creationTime>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:port uuid="ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <system>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="serial">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="uuid">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </system>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <os>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </os>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <features>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </features>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:7d:13:00"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="tapebecba8e-a0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </target>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </console>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </input>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <video>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </video>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]: </domain>
Dec 03 00:07:45 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.441 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <name>instance-00000010</name>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <uuid>5187b0f8-a8d1-4c99-a0b9-809caf89b88a</uuid>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656</nova:name>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:06:50</nova:creationTime>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:port uuid="ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <system>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="serial">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="uuid">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </system>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <os>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </os>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <features>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </features>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:7d:13:00"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapebecba8e-a0"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </target>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </console>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </input>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <video>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </video>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]: </domain>
Dec 03 00:07:45 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.441 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <name>instance-00000010</name>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <uuid>5187b0f8-a8d1-4c99-a0b9-809caf89b88a</uuid>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656</nova:name>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:06:50</nova:creationTime>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <nova:port uuid="ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <system>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="serial">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="uuid">5187b0f8-a8d1-4c99-a0b9-809caf89b88a</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </system>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <os>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </os>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <features>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </features>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/disk.config"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:7d:13:00"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapebecba8e-a0"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:07:45 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       </target>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a/console.log" append="off"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </console>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </input>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <video>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </video>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:07:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:07:45 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:07:45 compute-0 nova_compute[187243]: </domain>
Dec 03 00:07:45 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.442 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.522 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.524 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.543 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.543 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.1360092163086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.543 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.543 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.928 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:07:45 compute-0 nova_compute[187243]: 2025-12-03 00:07:45.929 187247 INFO nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.567 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating resource usage from migration 9e7124fa-e997-4c19-b812-98c74391064a
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.591 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 9e7124fa-e997-4c19-b812-98c74391064a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.592 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.592 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:07:45 up  1:15,  0 user,  load average: 0.42, 0.27, 0.33\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.632 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.802 187247 WARNING neutronclient.v2_0.client [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.952 187247 DEBUG nova.network.neutron [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updated VIF entry in instance network info cache for port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.952 187247 DEBUG nova.network.neutron [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Updating instance_info_cache with network_info: [{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:07:46 compute-0 nova_compute[187243]: 2025-12-03 00:07:46.955 187247 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.139 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.460 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.461 187247 DEBUG nova.virt.libvirt.migration [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.463 187247 DEBUG oslo_concurrency.lockutils [req-454f646e-e52d-4804-aa37-f9797142aab5 req-1af448bb-1acd-46f9-a13c-61969eac9d25 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-5187b0f8-a8d1-4c99-a0b9-809caf89b88a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.662 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.662 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.663 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:47 compute-0 kernel: tapebecba8e-a0 (unregistering): left promiscuous mode
Dec 03 00:07:47 compute-0 NetworkManager[55671]: <info>  [1764720467.8343] device (tapebecba8e-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.892 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:47 compute-0 ovn_controller[95488]: 2025-12-03T00:07:47Z|00127|binding|INFO|Releasing lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 from this chassis (sb_readonly=0)
Dec 03 00:07:47 compute-0 ovn_controller[95488]: 2025-12-03T00:07:47Z|00128|binding|INFO|Setting lport ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 down in Southbound
Dec 03 00:07:47 compute-0 ovn_controller[95488]: 2025-12-03T00:07:47Z|00129|binding|INFO|Removing iface tapebecba8e-a0 ovn-installed in OVS
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.894 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:47.900 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:13:00 10.100.0.14'], port_security=['fa:16:3e:7d:13:00 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5187b0f8-a8d1-4c99-a0b9-809caf89b88a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '10', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:07:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:47.902 104379 INFO neutron.agent.ovn.metadata.agent [-] Port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:07:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:47.903 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:07:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:47.905 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f1332558-3b70-45da-9387-49f2211dcf8b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:47.906 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace which is not needed anymore
Dec 03 00:07:47 compute-0 nova_compute[187243]: 2025-12-03 00:07:47.913 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:47 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec 03 00:07:47 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Consumed 14.042s CPU time.
Dec 03 00:07:47 compute-0 systemd-machined[153518]: Machine qemu-10-instance-00000010 terminated.
Dec 03 00:07:47 compute-0 podman[215862]: 2025-12-03 00:07:47.999508368 +0000 UTC m=+0.088491474 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:07:48 compute-0 podman[215864]: 2025-12-03 00:07:48.008883456 +0000 UTC m=+0.097902693 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:07:48 compute-0 podman[215924]: 2025-12-03 00:07:48.035897734 +0000 UTC m=+0.038224881 container kill cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.035 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:48 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [NOTICE]   (215622) : haproxy version is 3.0.5-8e879a5
Dec 03 00:07:48 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [NOTICE]   (215622) : path to executable is /usr/sbin/haproxy
Dec 03 00:07:48 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [WARNING]  (215622) : Exiting Master process...
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.038 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:48 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [ALERT]    (215622) : Current worker (215624) exited with code 143 (Terminated)
Dec 03 00:07:48 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[215618]: [WARNING]  (215622) : All workers exited. Exiting... (0)
Dec 03 00:07:48 compute-0 systemd[1]: libpod-cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65.scope: Deactivated successfully.
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.084 187247 DEBUG nova.virt.libvirt.guest [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.085 187247 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migration operation has completed
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.085 187247 INFO nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] _post_live_migration() is started..
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.088 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.089 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.089 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:07:48 compute-0 podman[215949]: 2025-12-03 00:07:48.08873335 +0000 UTC m=+0.030410721 container died cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.099 187247 DEBUG nova.compute.manager [req-312886ca-6a63-483b-b803-67f6e6f71b91 req-cfd8bfed-039b-4b80-b4d0-743264daac6f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.099 187247 DEBUG oslo_concurrency.lockutils [req-312886ca-6a63-483b-b803-67f6e6f71b91 req-cfd8bfed-039b-4b80-b4d0-743264daac6f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.100 187247 DEBUG oslo_concurrency.lockutils [req-312886ca-6a63-483b-b803-67f6e6f71b91 req-cfd8bfed-039b-4b80-b4d0-743264daac6f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.100 187247 DEBUG oslo_concurrency.lockutils [req-312886ca-6a63-483b-b803-67f6e6f71b91 req-cfd8bfed-039b-4b80-b4d0-743264daac6f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.101 187247 DEBUG nova.compute.manager [req-312886ca-6a63-483b-b803-67f6e6f71b91 req-cfd8bfed-039b-4b80-b4d0-743264daac6f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.101 187247 DEBUG nova.compute.manager [req-312886ca-6a63-483b-b803-67f6e6f71b91 req-cfd8bfed-039b-4b80-b4d0-743264daac6f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.102 187247 WARNING neutronclient.v2_0.client [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:07:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65-userdata-shm.mount: Deactivated successfully.
Dec 03 00:07:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-4f8692124caeeed2a34490509358303fdab42753d7a18e2f461c12b97393a524-merged.mount: Deactivated successfully.
Dec 03 00:07:48 compute-0 podman[215949]: 2025-12-03 00:07:48.127666767 +0000 UTC m=+0.069344148 container cleanup cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 03 00:07:48 compute-0 systemd[1]: libpod-conmon-cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65.scope: Deactivated successfully.
Dec 03 00:07:48 compute-0 podman[215954]: 2025-12-03 00:07:48.144856115 +0000 UTC m=+0.077560608 container remove cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.150 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eafaac6e-acd9-4bac-a6fb-1ba44806d416]: (4, ("Wed Dec  3 12:07:47 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65)\ncdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65\nWed Dec  3 12:07:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (cdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65)\ncdb083357627ef61103e829ffbc9f89557e14d0c1f43fb4e0f93652477846b65\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.152 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e74247a2-ba62-43a6-aa97-e93eaa46f839]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.152 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.152 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bb71e482-9131-40e8-a339-26ab153d5f9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.153 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.155 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:48 compute-0 kernel: tap9c6ad8f4-60: left promiscuous mode
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.170 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.172 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.173 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9cca240f-50d3-47aa-9e1e-fac73ae383d8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.195 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3ed281-51b8-42e1-8e81-d24577cc4508]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.196 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4eea1f3f-0c7f-4375-8fb9-f38a92d7081d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.211 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd64551-e29f-4275-a633-0ae55291984b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450449, 'reachable_time': 38001, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215990, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.214 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:07:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:07:48.214 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[1910f79a-9af5-4a35-b592-2fa7f10d4f84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:07:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d9c6ad8f4\x2d62a9\x2d4a0d\x2dac57\x2de980ee855c68.mount: Deactivated successfully.
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.952 187247 DEBUG nova.compute.manager [req-d4fb10a2-9779-4565-a9cf-920d0b7bace2 req-9e954e8a-5f7d-419c-ae1d-7c5a3e4216d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.953 187247 DEBUG oslo_concurrency.lockutils [req-d4fb10a2-9779-4565-a9cf-920d0b7bace2 req-9e954e8a-5f7d-419c-ae1d-7c5a3e4216d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.953 187247 DEBUG oslo_concurrency.lockutils [req-d4fb10a2-9779-4565-a9cf-920d0b7bace2 req-9e954e8a-5f7d-419c-ae1d-7c5a3e4216d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.953 187247 DEBUG oslo_concurrency.lockutils [req-d4fb10a2-9779-4565-a9cf-920d0b7bace2 req-9e954e8a-5f7d-419c-ae1d-7c5a3e4216d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.953 187247 DEBUG nova.compute.manager [req-d4fb10a2-9779-4565-a9cf-920d0b7bace2 req-9e954e8a-5f7d-419c-ae1d-7c5a3e4216d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:48 compute-0 nova_compute[187243]: 2025-12-03 00:07:48.954 187247 DEBUG nova.compute.manager [req-d4fb10a2-9779-4565-a9cf-920d0b7bace2 req-9e954e8a-5f7d-419c-ae1d-7c5a3e4216d0 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.171 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.919 187247 DEBUG nova.network.neutron [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.920 187247 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.921 187247 DEBUG nova.virt.libvirt.vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1298851656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-129',id=16,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:06:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-0lswcfs9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:07:25Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=5187b0f8-a8d1-4c99-a0b9-809caf89b88a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.921 187247 DEBUG nova.network.os_vif_util [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "address": "fa:16:3e:7d:13:00", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebecba8e-a0", "ovs_interfaceid": "ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.922 187247 DEBUG nova.network.os_vif_util [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.922 187247 DEBUG os_vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.924 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.925 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebecba8e-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.926 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.928 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.929 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.929 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9b777392-e409-4575-be58-905f72d108ca) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.929 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.930 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.933 187247 INFO os_vif [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:13:00,bridge_name='br-int',has_traffic_filtering=True,id=ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebecba8e-a0')
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.934 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.934 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.935 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.935 187247 DEBUG nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.936 187247 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Deleting instance files /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a_del
Dec 03 00:07:49 compute-0 nova_compute[187243]: 2025-12-03 00:07:49.937 187247 INFO nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Deletion of /var/lib/nova/instances/5187b0f8-a8d1-4c99-a0b9-809caf89b88a_del complete
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.186 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.186 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.187 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.187 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.187 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.187 187247 WARNING nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received unexpected event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with vm_state active and task_state migrating.
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.188 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.188 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.188 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.188 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.188 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.189 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-unplugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.189 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.189 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.189 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.189 187247 DEBUG oslo_concurrency.lockutils [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.190 187247 DEBUG nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:50 compute-0 nova_compute[187243]: 2025-12-03 00:07:50.190 187247 WARNING nova.compute.manager [req-9f8df126-ca7c-4f64-9e6c-d1522a81ea4d req-e45270e1-0ae1-4c89-923e-af60ec6a3d3d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received unexpected event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with vm_state active and task_state migrating.
Dec 03 00:07:51 compute-0 nova_compute[187243]: 2025-12-03 00:07:51.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:07:51 compute-0 nova_compute[187243]: 2025-12-03 00:07:51.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.103 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.269 187247 DEBUG nova.compute.manager [req-8890847f-0384-4461-92b7-acde317c53d6 req-69b29a39-ecd1-44b8-9cb3-8fd5a051c11b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.269 187247 DEBUG oslo_concurrency.lockutils [req-8890847f-0384-4461-92b7-acde317c53d6 req-69b29a39-ecd1-44b8-9cb3-8fd5a051c11b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.270 187247 DEBUG oslo_concurrency.lockutils [req-8890847f-0384-4461-92b7-acde317c53d6 req-69b29a39-ecd1-44b8-9cb3-8fd5a051c11b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.270 187247 DEBUG oslo_concurrency.lockutils [req-8890847f-0384-4461-92b7-acde317c53d6 req-69b29a39-ecd1-44b8-9cb3-8fd5a051c11b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.270 187247 DEBUG nova.compute.manager [req-8890847f-0384-4461-92b7-acde317c53d6 req-69b29a39-ecd1-44b8-9cb3-8fd5a051c11b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] No waiting events found dispatching network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.270 187247 WARNING nova.compute.manager [req-8890847f-0384-4461-92b7-acde317c53d6 req-69b29a39-ecd1-44b8-9cb3-8fd5a051c11b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Received unexpected event network-vif-plugged-ebecba8e-a0dd-49c4-9ec7-e4b5cac018a0 for instance with vm_state active and task_state migrating.
Dec 03 00:07:52 compute-0 nova_compute[187243]: 2025-12-03 00:07:52.915 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:54 compute-0 sshd-session[215992]: Invalid user syncuser from 102.210.148.92 port 37806
Dec 03 00:07:54 compute-0 sshd-session[215992]: Received disconnect from 102.210.148.92 port 37806:11: Bye Bye [preauth]
Dec 03 00:07:54 compute-0 sshd-session[215992]: Disconnected from invalid user syncuser 102.210.148.92 port 37806 [preauth]
Dec 03 00:07:54 compute-0 nova_compute[187243]: 2025-12-03 00:07:54.977 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:57 compute-0 sshd-session[215994]: Invalid user cc from 23.95.37.90 port 51004
Dec 03 00:07:57 compute-0 sshd-session[215994]: Received disconnect from 23.95.37.90 port 51004:11: Bye Bye [preauth]
Dec 03 00:07:57 compute-0 sshd-session[215994]: Disconnected from invalid user cc 23.95.37.90 port 51004 [preauth]
Dec 03 00:07:57 compute-0 nova_compute[187243]: 2025-12-03 00:07:57.915 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:07:58 compute-0 nova_compute[187243]: 2025-12-03 00:07:58.968 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:58 compute-0 nova_compute[187243]: 2025-12-03 00:07:58.968 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:58 compute-0 nova_compute[187243]: 2025-12-03 00:07:58.969 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "5187b0f8-a8d1-4c99-a0b9-809caf89b88a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.483 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.483 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.483 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.484 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.630 187247 WARNING nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.631 187247 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.653 187247 DEBUG oslo_concurrency.processutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.653 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.16493225097656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.653 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.654 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:07:59 compute-0 podman[197600]: time="2025-12-03T00:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:07:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:07:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:07:59 compute-0 nova_compute[187243]: 2025-12-03 00:07:59.979 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:00 compute-0 nova_compute[187243]: 2025-12-03 00:08:00.672 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 5187b0f8-a8d1-4c99-a0b9-809caf89b88a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:08:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:00.697 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:00.697 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:00.697 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:00 compute-0 nova_compute[187243]: 2025-12-03 00:08:00.726 187247 DEBUG nova.compute.manager [None req-d7809ae6-7f69-4b5d-80e8-e3a9c2acad94 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Dec 03 00:08:00 compute-0 nova_compute[187243]: 2025-12-03 00:08:00.790 187247 DEBUG nova.compute.provider_tree [None req-d7809ae6-7f69-4b5d-80e8-e3a9c2acad94 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 28 to 30 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:08:01 compute-0 nova_compute[187243]: 2025-12-03 00:08:01.179 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:08:01 compute-0 nova_compute[187243]: 2025-12-03 00:08:01.205 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 9e7124fa-e997-4c19-b812-98c74391064a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:08:01 compute-0 nova_compute[187243]: 2025-12-03 00:08:01.205 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:08:01 compute-0 nova_compute[187243]: 2025-12-03 00:08:01.205 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:07:59 up  1:16,  0 user,  load average: 0.36, 0.26, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:08:01 compute-0 nova_compute[187243]: 2025-12-03 00:08:01.273 187247 DEBUG nova.compute.provider_tree [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: ERROR   00:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: ERROR   00:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: ERROR   00:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: ERROR   00:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: ERROR   00:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:08:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:08:01 compute-0 nova_compute[187243]: 2025-12-03 00:08:01.779 187247 DEBUG nova.scheduler.client.report [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:08:01 compute-0 sshd[128750]: Timeout before authentication for connection from 101.47.140.127 to 38.102.83.77, pid = 215275
Dec 03 00:08:02 compute-0 podman[215999]: 2025-12-03 00:08:02.103335102 +0000 UTC m=+0.055680647 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:08:02 compute-0 nova_compute[187243]: 2025-12-03 00:08:02.306 187247 DEBUG nova.compute.resource_tracker [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:08:02 compute-0 nova_compute[187243]: 2025-12-03 00:08:02.306 187247 DEBUG oslo_concurrency.lockutils [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.652s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:02 compute-0 nova_compute[187243]: 2025-12-03 00:08:02.320 187247 INFO nova.compute.manager [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:08:02 compute-0 nova_compute[187243]: 2025-12-03 00:08:02.959 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:03 compute-0 nova_compute[187243]: 2025-12-03 00:08:03.417 187247 INFO nova.scheduler.client.report [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 9e7124fa-e997-4c19-b812-98c74391064a
Dec 03 00:08:03 compute-0 nova_compute[187243]: 2025-12-03 00:08:03.417 187247 DEBUG nova.virt.libvirt.driver [None req-83e0c329-3426-4c82-970a-ee60fc2fc629 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 5187b0f8-a8d1-4c99-a0b9-809caf89b88a] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:08:04 compute-0 nova_compute[187243]: 2025-12-03 00:08:04.982 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:06 compute-0 podman[216023]: 2025-12-03 00:08:06.124747024 +0000 UTC m=+0.078940254 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:08:06 compute-0 sshd-session[216021]: Connection closed by 45.78.218.154 port 45868 [preauth]
Dec 03 00:08:07 compute-0 nova_compute[187243]: 2025-12-03 00:08:07.969 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:09 compute-0 nova_compute[187243]: 2025-12-03 00:08:09.984 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:12 compute-0 nova_compute[187243]: 2025-12-03 00:08:12.971 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:13 compute-0 sshd-session[216044]: Invalid user desliga from 61.220.235.10 port 33272
Dec 03 00:08:13 compute-0 sshd-session[216044]: Received disconnect from 61.220.235.10 port 33272:11: Bye Bye [preauth]
Dec 03 00:08:13 compute-0 sshd-session[216044]: Disconnected from invalid user desliga 61.220.235.10 port 33272 [preauth]
Dec 03 00:08:15 compute-0 nova_compute[187243]: 2025-12-03 00:08:15.023 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:15 compute-0 podman[216046]: 2025-12-03 00:08:15.111853951 +0000 UTC m=+0.060816682 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:08:18 compute-0 nova_compute[187243]: 2025-12-03 00:08:18.022 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:18 compute-0 podman[216070]: 2025-12-03 00:08:18.095530735 +0000 UTC m=+0.053079104 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Dec 03 00:08:18 compute-0 podman[216071]: 2025-12-03 00:08:18.161734378 +0000 UTC m=+0.112112152 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 03 00:08:20 compute-0 nova_compute[187243]: 2025-12-03 00:08:20.025 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:21 compute-0 nova_compute[187243]: 2025-12-03 00:08:21.088 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:21 compute-0 nova_compute[187243]: 2025-12-03 00:08:21.088 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:21 compute-0 nova_compute[187243]: 2025-12-03 00:08:21.594 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:08:22 compute-0 nova_compute[187243]: 2025-12-03 00:08:22.138 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:22 compute-0 nova_compute[187243]: 2025-12-03 00:08:22.139 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:22 compute-0 nova_compute[187243]: 2025-12-03 00:08:22.145 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:08:22 compute-0 nova_compute[187243]: 2025-12-03 00:08:22.145 187247 INFO nova.compute.claims [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:08:23 compute-0 nova_compute[187243]: 2025-12-03 00:08:23.023 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:23 compute-0 nova_compute[187243]: 2025-12-03 00:08:23.194 187247 DEBUG nova.compute.provider_tree [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:08:23 compute-0 nova_compute[187243]: 2025-12-03 00:08:23.701 187247 DEBUG nova.scheduler.client.report [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:08:23 compute-0 sshd[128750]: drop connection #0 from [101.47.140.127]:38214 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 03 00:08:24 compute-0 nova_compute[187243]: 2025-12-03 00:08:24.212 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.073s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:24 compute-0 nova_compute[187243]: 2025-12-03 00:08:24.213 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:08:24 compute-0 nova_compute[187243]: 2025-12-03 00:08:24.725 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:08:24 compute-0 nova_compute[187243]: 2025-12-03 00:08:24.726 187247 DEBUG nova.network.neutron [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:08:24 compute-0 nova_compute[187243]: 2025-12-03 00:08:24.726 187247 WARNING neutronclient.v2_0.client [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:24 compute-0 nova_compute[187243]: 2025-12-03 00:08:24.726 187247 WARNING neutronclient.v2_0.client [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:25 compute-0 nova_compute[187243]: 2025-12-03 00:08:25.028 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:25 compute-0 nova_compute[187243]: 2025-12-03 00:08:25.233 187247 INFO nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:08:25 compute-0 nova_compute[187243]: 2025-12-03 00:08:25.750 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.708 187247 DEBUG nova.network.neutron [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Successfully created port: 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.772 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.774 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.774 187247 INFO nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Creating image(s)
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.775 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.775 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.776 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.776 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.779 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.782 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.839 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.842 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.843 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.844 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.849 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.849 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.938 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.940 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.986 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.988 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:26 compute-0 nova_compute[187243]: 2025-12-03 00:08:26.989 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.047 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.049 187247 DEBUG nova.virt.disk.api [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Checking if we can resize image /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.049 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.142 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.144 187247 DEBUG nova.virt.disk.api [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Cannot resize image /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.145 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.146 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Ensure instance console log exists: /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.147 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.148 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:27 compute-0 nova_compute[187243]: 2025-12-03 00:08:27.148 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.026 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.214 187247 DEBUG nova.network.neutron [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Successfully updated port: 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.279 187247 DEBUG nova.compute.manager [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-changed-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.279 187247 DEBUG nova.compute.manager [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Refreshing instance network info cache due to event network-changed-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.279 187247 DEBUG oslo_concurrency.lockutils [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.279 187247 DEBUG oslo_concurrency.lockutils [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.280 187247 DEBUG nova.network.neutron [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Refreshing network info cache for port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.725 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:08:28 compute-0 nova_compute[187243]: 2025-12-03 00:08:28.786 187247 WARNING neutronclient.v2_0.client [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:29 compute-0 nova_compute[187243]: 2025-12-03 00:08:29.520 187247 DEBUG nova.network.neutron [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:08:29 compute-0 nova_compute[187243]: 2025-12-03 00:08:29.678 187247 DEBUG nova.network.neutron [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:29 compute-0 podman[197600]: time="2025-12-03T00:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:08:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:08:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:08:30 compute-0 nova_compute[187243]: 2025-12-03 00:08:30.031 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:30 compute-0 nova_compute[187243]: 2025-12-03 00:08:30.192 187247 DEBUG oslo_concurrency.lockutils [req-81a15662-9f95-4557-9b8d-ae9565adf6df req-ea22e754-391a-4b6a-b08c-8bb6fbda4d21 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:08:30 compute-0 nova_compute[187243]: 2025-12-03 00:08:30.193 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquired lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:08:30 compute-0 nova_compute[187243]: 2025-12-03 00:08:30.194 187247 DEBUG nova.network.neutron [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:08:30 compute-0 nova_compute[187243]: 2025-12-03 00:08:30.759 187247 DEBUG nova.network.neutron [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.024 187247 WARNING neutronclient.v2_0.client [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.229 187247 DEBUG nova.network.neutron [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: ERROR   00:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: ERROR   00:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: ERROR   00:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: ERROR   00:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: ERROR   00:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:08:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.746 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Releasing lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.747 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance network_info: |[{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.753 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Start _get_guest_xml network_info=[{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.760 187247 WARNING nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.763 187247 DEBUG nova.virt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927', uuid='13917c6d-537d-4b86-a989-9ce2df414798'), owner=OwnerMeta(userid='d7f72082c96e4f868d5b158a57237cee', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin', projectid='869170c9b0864bd8a0f2258e90e55a84', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720511.762901) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.768 187247 DEBUG nova.virt.libvirt.host [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.770 187247 DEBUG nova.virt.libvirt.host [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.774 187247 DEBUG nova.virt.libvirt.host [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.775 187247 DEBUG nova.virt.libvirt.host [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.777 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.777 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.778 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.779 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.779 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.780 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.780 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.781 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.781 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.782 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.782 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.783 187247 DEBUG nova.virt.hardware [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.790 187247 DEBUG nova.virt.libvirt.vif [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:08:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-476',id=18,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-r72zwvbi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owne
r_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:08:25Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=13917c6d-537d-4b86-a989-9ce2df414798,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.790 187247 DEBUG nova.network.os_vif_util [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.792 187247 DEBUG nova.network.os_vif_util [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:08:31 compute-0 nova_compute[187243]: 2025-12-03 00:08:31.793 187247 DEBUG nova.objects.instance [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 13917c6d-537d-4b86-a989-9ce2df414798 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.555 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <uuid>13917c6d-537d-4b86-a989-9ce2df414798</uuid>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <name>instance-00000012</name>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927</nova:name>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:08:31</nova:creationTime>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:08:32 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:08:32 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         <nova:port uuid="08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb">
Dec 03 00:08:32 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <system>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <entry name="serial">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <entry name="uuid">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </system>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <os>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </os>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <features>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </features>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:03:5d:ed"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <target dev="tap08bf4d8e-df"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <video>
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </video>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:08:32 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:08:32 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:08:32 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:08:32 compute-0 nova_compute[187243]: </domain>
Dec 03 00:08:32 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.558 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Preparing to wait for external event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.559 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.559 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.560 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.561 187247 DEBUG nova.virt.libvirt.vif [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:08:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-476',id=18,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-r72zwvbi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:08:25Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=13917c6d-537d-4b86-a989-9ce2df414798,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.561 187247 DEBUG nova.network.os_vif_util [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converting VIF {"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.563 187247 DEBUG nova.network.os_vif_util [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.563 187247 DEBUG os_vif [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.564 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.565 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.566 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.567 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.567 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e6dfa855-c9c5-587c-84a9-9e8d9bb572d4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.569 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.571 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.575 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.576 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08bf4d8e-df, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.576 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap08bf4d8e-df, col_values=(('qos', UUID('4a96656d-293e-467b-beb3-25f746ac12cf')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.577 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap08bf4d8e-df, col_values=(('external_ids', {'iface-id': '08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:5d:ed', 'vm-uuid': '13917c6d-537d-4b86-a989-9ce2df414798'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.579 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 NetworkManager[55671]: <info>  [1764720512.5810] manager: (tap08bf4d8e-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.582 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.588 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:32 compute-0 nova_compute[187243]: 2025-12-03 00:08:32.589 187247 INFO os_vif [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df')
Dec 03 00:08:33 compute-0 nova_compute[187243]: 2025-12-03 00:08:33.030 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:33 compute-0 podman[216134]: 2025-12-03 00:08:33.17444866 +0000 UTC m=+0.122620218 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:08:34 compute-0 nova_compute[187243]: 2025-12-03 00:08:34.142 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:08:34 compute-0 nova_compute[187243]: 2025-12-03 00:08:34.143 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:08:34 compute-0 nova_compute[187243]: 2025-12-03 00:08:34.144 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] No VIF found with MAC fa:16:3e:03:5d:ed, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:08:34 compute-0 nova_compute[187243]: 2025-12-03 00:08:34.145 187247 INFO nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Using config drive
Dec 03 00:08:34 compute-0 nova_compute[187243]: 2025-12-03 00:08:34.661 187247 WARNING neutronclient.v2_0.client [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:08:35 compute-0 nova_compute[187243]: 2025-12-03 00:08:35.628 187247 INFO nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Creating config drive at /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config
Dec 03 00:08:35 compute-0 nova_compute[187243]: 2025-12-03 00:08:35.633 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpa6022wlb execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:35 compute-0 nova_compute[187243]: 2025-12-03 00:08:35.771 187247 DEBUG oslo_concurrency.processutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpa6022wlb" returned: 0 in 0.138s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:35 compute-0 kernel: tap08bf4d8e-df: entered promiscuous mode
Dec 03 00:08:35 compute-0 NetworkManager[55671]: <info>  [1764720515.8567] manager: (tap08bf4d8e-df): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Dec 03 00:08:35 compute-0 nova_compute[187243]: 2025-12-03 00:08:35.858 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:35 compute-0 ovn_controller[95488]: 2025-12-03T00:08:35Z|00130|binding|INFO|Claiming lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for this chassis.
Dec 03 00:08:35 compute-0 ovn_controller[95488]: 2025-12-03T00:08:35Z|00131|binding|INFO|08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb: Claiming fa:16:3e:03:5d:ed 10.100.0.12
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.868 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5d:ed 10.100.0.12'], port_security=['fa:16:3e:03:5d:ed 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '13917c6d-537d-4b86-a989-9ce2df414798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.869 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 bound to our chassis
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.871 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:35 compute-0 ovn_controller[95488]: 2025-12-03T00:08:35Z|00132|binding|INFO|Setting lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb ovn-installed in OVS
Dec 03 00:08:35 compute-0 ovn_controller[95488]: 2025-12-03T00:08:35Z|00133|binding|INFO|Setting lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb up in Southbound
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.883 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc5923a-b4da-4c5f-aa44-3d7576965cc4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.883 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c6ad8f4-61 in ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.885 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c6ad8f4-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.886 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[64c46e72-bf70-4e17-9286-0ea4fdcca837]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 nova_compute[187243]: 2025-12-03 00:08:35.886 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.887 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c991660f-27b9-4f1c-9873-f21811f316b8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 nova_compute[187243]: 2025-12-03 00:08:35.892 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.901 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[0a20e575-e984-48a7-9612-6761c1cc3d39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.907 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8183f67c-4ded-4d40-82f8-5e7836757b1a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 systemd-udevd[216178]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:08:35 compute-0 systemd-machined[153518]: New machine qemu-11-instance-00000012.
Dec 03 00:08:35 compute-0 NetworkManager[55671]: <info>  [1764720515.9208] device (tap08bf4d8e-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:08:35 compute-0 NetworkManager[55671]: <info>  [1764720515.9223] device (tap08bf4d8e-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:08:35 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000012.
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.935 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[93dd91ab-fe98-4f62-814b-43fd7c990a0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.940 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ddaace3e-ea48-42bc-a203-9531b8f24067]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 NetworkManager[55671]: <info>  [1764720515.9411] manager: (tap9c6ad8f4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.986 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5fb035-1640-4979-8cc9-bef63d18a2e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:35 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:35.991 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[94ea5762-b4ad-4139-af06-6f934e5ecb59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 NetworkManager[55671]: <info>  [1764720516.0157] device (tap9c6ad8f4-60): carrier: link connected
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.022 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[5657c1ae-45a6-4040-a602-bdedc9e521a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.039 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6a6aa5-a3dd-4101-870b-844ba025026d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460665, 'reachable_time': 20203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216208, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.055 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2e57ccb9-91ef-4e7e-8df2-08ed4bfbc563]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:f806'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460665, 'tstamp': 460665}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216209, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.071 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[33f8918a-32c2-440a-9ac4-9ebe3b72b97c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c6ad8f4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:f8:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460665, 'reachable_time': 20203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216210, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.100 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b3357431-496d-4f33-bd17-e5d6f048bb62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.153 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[79c80d09-4656-4f9c-8d34-c645ceaf833f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.154 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.155 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.155 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c6ad8f4-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:36 compute-0 NetworkManager[55671]: <info>  [1764720516.1571] manager: (tap9c6ad8f4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.156 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:36 compute-0 kernel: tap9c6ad8f4-60: entered promiscuous mode
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.158 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.162 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c6ad8f4-60, col_values=(('external_ids', {'iface-id': 'df9da247-f3c2-412c-95a4-9a2562c93dd4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.162 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:36 compute-0 ovn_controller[95488]: 2025-12-03T00:08:36Z|00134|binding|INFO|Releasing lport df9da247-f3c2-412c-95a4-9a2562c93dd4 from this chassis (sb_readonly=0)
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.180 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b13a6a-4cf1-496f-93bc-650d18587c31]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.181 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.181 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.182 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.182 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.182 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.182 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fc262621-0b54-4492-80bc-f5eb0cd682a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.183 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.183 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c2acaae2-a027-443e-8916-b8fee6bf2f7a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.183 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:08:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:36.185 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'env', 'PROCESS_TAG=haproxy-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:08:36 compute-0 podman[216249]: 2025-12-03 00:08:36.590867934 +0000 UTC m=+0.096903291 container create c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:08:36 compute-0 podman[216249]: 2025-12-03 00:08:36.517376634 +0000 UTC m=+0.023412041 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:08:36 compute-0 systemd[1]: Started libpod-conmon-c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1.scope.
Dec 03 00:08:36 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:08:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd2050e70701eccf5fa5e082c56894d16798cf9113bbe8a7b5c68c8f64c69a74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:08:36 compute-0 podman[216249]: 2025-12-03 00:08:36.667601733 +0000 UTC m=+0.173637110 container init c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.671 187247 DEBUG nova.compute.manager [req-10a855cf-4f4a-49b1-b9db-8a8579a71cf4 req-d24faefe-c348-42fe-afe6-a91ffb4760c8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.672 187247 DEBUG oslo_concurrency.lockutils [req-10a855cf-4f4a-49b1-b9db-8a8579a71cf4 req-d24faefe-c348-42fe-afe6-a91ffb4760c8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.672 187247 DEBUG oslo_concurrency.lockutils [req-10a855cf-4f4a-49b1-b9db-8a8579a71cf4 req-d24faefe-c348-42fe-afe6-a91ffb4760c8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.672 187247 DEBUG oslo_concurrency.lockutils [req-10a855cf-4f4a-49b1-b9db-8a8579a71cf4 req-d24faefe-c348-42fe-afe6-a91ffb4760c8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.672 187247 DEBUG nova.compute.manager [req-10a855cf-4f4a-49b1-b9db-8a8579a71cf4 req-d24faefe-c348-42fe-afe6-a91ffb4760c8 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Processing event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.673 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:08:36 compute-0 podman[216249]: 2025-12-03 00:08:36.678341835 +0000 UTC m=+0.184377192 container start c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.678 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:08:36 compute-0 podman[216262]: 2025-12-03 00:08:36.682629509 +0000 UTC m=+0.061393416 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.684 187247 INFO nova.virt.libvirt.driver [-] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance spawned successfully.
Dec 03 00:08:36 compute-0 nova_compute[187243]: 2025-12-03 00:08:36.684 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:08:36 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [NOTICE]   (216288) : New worker (216290) forked
Dec 03 00:08:36 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [NOTICE]   (216288) : Loading success.
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.103 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.351 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.352 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.352 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.352 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.353 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.353 187247 DEBUG nova.virt.libvirt.driver [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.618 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.866 187247 INFO nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Took 11.09 seconds to spawn the instance on the hypervisor.
Dec 03 00:08:37 compute-0 nova_compute[187243]: 2025-12-03 00:08:37.866 187247 DEBUG nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.031 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.418 187247 INFO nova.compute.manager [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Took 16.32 seconds to build instance.
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.915 187247 DEBUG nova.compute.manager [req-631e866b-3a38-4adc-a02d-32cf193f8885 req-847c9847-a89a-48e8-b0f3-d1770adf458a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.916 187247 DEBUG oslo_concurrency.lockutils [req-631e866b-3a38-4adc-a02d-32cf193f8885 req-847c9847-a89a-48e8-b0f3-d1770adf458a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.916 187247 DEBUG oslo_concurrency.lockutils [req-631e866b-3a38-4adc-a02d-32cf193f8885 req-847c9847-a89a-48e8-b0f3-d1770adf458a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.916 187247 DEBUG oslo_concurrency.lockutils [req-631e866b-3a38-4adc-a02d-32cf193f8885 req-847c9847-a89a-48e8-b0f3-d1770adf458a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.916 187247 DEBUG nova.compute.manager [req-631e866b-3a38-4adc-a02d-32cf193f8885 req-847c9847-a89a-48e8-b0f3-d1770adf458a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.917 187247 WARNING nova.compute.manager [req-631e866b-3a38-4adc-a02d-32cf193f8885 req-847c9847-a89a-48e8-b0f3-d1770adf458a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received unexpected event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with vm_state active and task_state None.
Dec 03 00:08:38 compute-0 nova_compute[187243]: 2025-12-03 00:08:38.923 187247 DEBUG oslo_concurrency.lockutils [None req-f060d778-bdcf-4fed-b2e2-19bd14450ae8 d7f72082c96e4f868d5b158a57237cee 869170c9b0864bd8a0f2258e90e55a84 - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.834s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:39 compute-0 nova_compute[187243]: 2025-12-03 00:08:39.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:39 compute-0 nova_compute[187243]: 2025-12-03 00:08:39.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:08:42 compute-0 nova_compute[187243]: 2025-12-03 00:08:42.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:42 compute-0 nova_compute[187243]: 2025-12-03 00:08:42.621 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:43 compute-0 nova_compute[187243]: 2025-12-03 00:08:43.034 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:43 compute-0 nova_compute[187243]: 2025-12-03 00:08:43.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:43 compute-0 nova_compute[187243]: 2025-12-03 00:08:43.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:44 compute-0 nova_compute[187243]: 2025-12-03 00:08:44.184 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:44 compute-0 nova_compute[187243]: 2025-12-03 00:08:44.185 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:44 compute-0 nova_compute[187243]: 2025-12-03 00:08:44.186 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:44 compute-0 nova_compute[187243]: 2025-12-03 00:08:44.186 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.448 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.504 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.505 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.560 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.712 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.714 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.730 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.731 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.16411972045898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.731 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:08:45 compute-0 nova_compute[187243]: 2025-12-03 00:08:45.732 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:08:46 compute-0 podman[216308]: 2025-12-03 00:08:46.100133684 +0000 UTC m=+0.048975364 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:08:47 compute-0 nova_compute[187243]: 2025-12-03 00:08:47.064 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 13917c6d-537d-4b86-a989-9ce2df414798 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:08:47 compute-0 nova_compute[187243]: 2025-12-03 00:08:47.065 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:08:47 compute-0 nova_compute[187243]: 2025-12-03 00:08:47.065 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:08:45 up  1:16,  0 user,  load average: 0.31, 0.25, 0.32\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:08:47 compute-0 nova_compute[187243]: 2025-12-03 00:08:47.133 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:08:47 compute-0 nova_compute[187243]: 2025-12-03 00:08:47.653 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:08:47 compute-0 nova_compute[187243]: 2025-12-03 00:08:47.672 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:48 compute-0 nova_compute[187243]: 2025-12-03 00:08:48.037 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:48 compute-0 nova_compute[187243]: 2025-12-03 00:08:48.241 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:08:48 compute-0 nova_compute[187243]: 2025-12-03 00:08:48.243 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.511s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:08:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:48.311 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:08:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:48.311 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:08:48 compute-0 nova_compute[187243]: 2025-12-03 00:08:48.312 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:49 compute-0 ovn_controller[95488]: 2025-12-03T00:08:49Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:5d:ed 10.100.0.12
Dec 03 00:08:49 compute-0 ovn_controller[95488]: 2025-12-03T00:08:49Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:5d:ed 10.100.0.12
Dec 03 00:08:49 compute-0 nova_compute[187243]: 2025-12-03 00:08:49.246 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:49 compute-0 nova_compute[187243]: 2025-12-03 00:08:49.247 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:49 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:08:49.312 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:08:49 compute-0 podman[216355]: 2025-12-03 00:08:49.329422881 +0000 UTC m=+0.046914434 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 03 00:08:49 compute-0 podman[216356]: 2025-12-03 00:08:49.366570966 +0000 UTC m=+0.079655912 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 03 00:08:51 compute-0 nova_compute[187243]: 2025-12-03 00:08:51.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:08:52 compute-0 nova_compute[187243]: 2025-12-03 00:08:52.674 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:53 compute-0 nova_compute[187243]: 2025-12-03 00:08:53.039 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:57 compute-0 nova_compute[187243]: 2025-12-03 00:08:57.676 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:58 compute-0 nova_compute[187243]: 2025-12-03 00:08:58.041 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:08:59 compute-0 sshd-session[216396]: Received disconnect from 20.123.120.169 port 52658:11: Bye Bye [preauth]
Dec 03 00:08:59 compute-0 sshd-session[216396]: Disconnected from authenticating user root 20.123.120.169 port 52658 [preauth]
Dec 03 00:08:59 compute-0 podman[197600]: time="2025-12-03T00:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:08:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:08:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Dec 03 00:09:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:00.698 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:00.699 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:00.699 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: ERROR   00:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: ERROR   00:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: ERROR   00:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: ERROR   00:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: ERROR   00:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:09:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:09:02 compute-0 nova_compute[187243]: 2025-12-03 00:09:02.679 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:03 compute-0 nova_compute[187243]: 2025-12-03 00:09:03.044 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:04 compute-0 podman[216401]: 2025-12-03 00:09:04.169968529 +0000 UTC m=+0.106176597 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350)
Dec 03 00:09:05 compute-0 sshd-session[216399]: Invalid user intel from 102.210.148.92 port 46322
Dec 03 00:09:05 compute-0 sshd-session[216399]: Received disconnect from 102.210.148.92 port 46322:11: Bye Bye [preauth]
Dec 03 00:09:05 compute-0 sshd-session[216399]: Disconnected from invalid user intel 102.210.148.92 port 46322 [preauth]
Dec 03 00:09:07 compute-0 podman[216422]: 2025-12-03 00:09:07.097408792 +0000 UTC m=+0.050830129 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:09:07 compute-0 nova_compute[187243]: 2025-12-03 00:09:07.683 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:08 compute-0 nova_compute[187243]: 2025-12-03 00:09:08.046 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:08 compute-0 sshd-session[216442]: Invalid user david from 49.247.36.49 port 49274
Dec 03 00:09:08 compute-0 sshd-session[216442]: Received disconnect from 49.247.36.49 port 49274:11: Bye Bye [preauth]
Dec 03 00:09:08 compute-0 sshd-session[216442]: Disconnected from invalid user david 49.247.36.49 port 49274 [preauth]
Dec 03 00:09:12 compute-0 nova_compute[187243]: 2025-12-03 00:09:12.685 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:12 compute-0 sshd-session[216444]: Invalid user minecraft from 45.78.219.213 port 45552
Dec 03 00:09:13 compute-0 nova_compute[187243]: 2025-12-03 00:09:13.048 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:13 compute-0 sshd-session[216444]: Received disconnect from 45.78.219.213 port 45552:11: Bye Bye [preauth]
Dec 03 00:09:13 compute-0 sshd-session[216444]: Disconnected from invalid user minecraft 45.78.219.213 port 45552 [preauth]
Dec 03 00:09:14 compute-0 nova_compute[187243]: 2025-12-03 00:09:14.471 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Check if temp file /var/lib/nova/instances/tmp5i5180fs exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:09:14 compute-0 nova_compute[187243]: 2025-12-03 00:09:14.477 187247 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='13917c6d-537d-4b86-a989-9ce2df414798',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:09:17 compute-0 podman[216448]: 2025-12-03 00:09:17.130036271 +0000 UTC m=+0.078238067 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:09:17 compute-0 sshd-session[216473]: Invalid user nrk from 23.95.37.90 port 52550
Dec 03 00:09:17 compute-0 sshd-session[216473]: Received disconnect from 23.95.37.90 port 52550:11: Bye Bye [preauth]
Dec 03 00:09:17 compute-0 sshd-session[216473]: Disconnected from invalid user nrk 23.95.37.90 port 52550 [preauth]
Dec 03 00:09:17 compute-0 nova_compute[187243]: 2025-12-03 00:09:17.687 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:18 compute-0 nova_compute[187243]: 2025-12-03 00:09:18.056 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:20 compute-0 podman[216475]: 2025-12-03 00:09:20.105498372 +0000 UTC m=+0.056869376 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:09:20 compute-0 podman[216476]: 2025-12-03 00:09:20.150403046 +0000 UTC m=+0.092473903 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.543 187247 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.600 187247 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.601 187247 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.653 187247 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.656 187247 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Preparing to wait for external event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.656 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.657 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:20 compute-0 nova_compute[187243]: 2025-12-03 00:09:20.658 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:22 compute-0 sshd-session[216446]: Received disconnect from 45.78.222.160 port 46202:11: Bye Bye [preauth]
Dec 03 00:09:22 compute-0 sshd-session[216446]: Disconnected from authenticating user root 45.78.222.160 port 46202 [preauth]
Dec 03 00:09:22 compute-0 nova_compute[187243]: 2025-12-03 00:09:22.691 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:23 compute-0 nova_compute[187243]: 2025-12-03 00:09:23.058 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:27 compute-0 nova_compute[187243]: 2025-12-03 00:09:27.694 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:27 compute-0 ovn_controller[95488]: 2025-12-03T00:09:27Z|00135|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Dec 03 00:09:28 compute-0 nova_compute[187243]: 2025-12-03 00:09:28.059 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:29 compute-0 podman[197600]: time="2025-12-03T00:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:09:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:09:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Dec 03 00:09:30 compute-0 nova_compute[187243]: 2025-12-03 00:09:30.348 187247 DEBUG nova.compute.manager [req-0774225a-1616-4aa0-bdb3-da3c8cc5962c req-ada8c40f-c98e-45c1-9666-488f2fd589b7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:30 compute-0 nova_compute[187243]: 2025-12-03 00:09:30.349 187247 DEBUG oslo_concurrency.lockutils [req-0774225a-1616-4aa0-bdb3-da3c8cc5962c req-ada8c40f-c98e-45c1-9666-488f2fd589b7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:30 compute-0 nova_compute[187243]: 2025-12-03 00:09:30.349 187247 DEBUG oslo_concurrency.lockutils [req-0774225a-1616-4aa0-bdb3-da3c8cc5962c req-ada8c40f-c98e-45c1-9666-488f2fd589b7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:30 compute-0 nova_compute[187243]: 2025-12-03 00:09:30.350 187247 DEBUG oslo_concurrency.lockutils [req-0774225a-1616-4aa0-bdb3-da3c8cc5962c req-ada8c40f-c98e-45c1-9666-488f2fd589b7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:30 compute-0 nova_compute[187243]: 2025-12-03 00:09:30.350 187247 DEBUG nova.compute.manager [req-0774225a-1616-4aa0-bdb3-da3c8cc5962c req-ada8c40f-c98e-45c1-9666-488f2fd589b7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No event matching network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb in dict_keys([('network-vif-plugged', '08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:09:30 compute-0 nova_compute[187243]: 2025-12-03 00:09:30.350 187247 DEBUG nova.compute.manager [req-0774225a-1616-4aa0-bdb3-da3c8cc5962c req-ada8c40f-c98e-45c1-9666-488f2fd589b7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: ERROR   00:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: ERROR   00:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: ERROR   00:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: ERROR   00:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: ERROR   00:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:09:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.438 187247 DEBUG nova.compute.manager [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.438 187247 DEBUG oslo_concurrency.lockutils [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.438 187247 DEBUG oslo_concurrency.lockutils [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.439 187247 DEBUG oslo_concurrency.lockutils [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.439 187247 DEBUG nova.compute.manager [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Processing event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.439 187247 DEBUG nova.compute.manager [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-changed-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.439 187247 DEBUG nova.compute.manager [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Refreshing instance network info cache due to event network-changed-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.439 187247 DEBUG oslo_concurrency.lockutils [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.440 187247 DEBUG oslo_concurrency.lockutils [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.440 187247 DEBUG nova.network.neutron [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Refreshing network info cache for port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.697 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.970 187247 INFO nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Took 12.31 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.971 187247 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:09:32 compute-0 nova_compute[187243]: 2025-12-03 00:09:32.977 187247 WARNING neutronclient.v2_0.client [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:33 compute-0 nova_compute[187243]: 2025-12-03 00:09:33.061 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:33 compute-0 nova_compute[187243]: 2025-12-03 00:09:33.483 187247 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5i5180fs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='13917c6d-537d-4b86-a989-9ce2df414798',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0843730c-83f4-417c-a99b-29298db49e9e),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:09:33 compute-0 nova_compute[187243]: 2025-12-03 00:09:33.581 187247 WARNING neutronclient.v2_0.client [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:33 compute-0 nova_compute[187243]: 2025-12-03 00:09:33.743 187247 DEBUG nova.network.neutron [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updated VIF entry in instance network info cache for port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:09:33 compute-0 nova_compute[187243]: 2025-12-03 00:09:33.744 187247 DEBUG nova.network.neutron [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating instance_info_cache with network_info: [{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:09:34 compute-0 nova_compute[187243]: 2025-12-03 00:09:34.943 187247 DEBUG nova.objects.instance [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 13917c6d-537d-4b86-a989-9ce2df414798 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:09:34 compute-0 nova_compute[187243]: 2025-12-03 00:09:34.945 187247 DEBUG oslo_concurrency.lockutils [req-c6fb3ff5-4c53-4420-a706-8fdb1ba29a74 req-ec46d591-0c31-4ec1-9e30-4b7015615339 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-13917c6d-537d-4b86-a989-9ce2df414798" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:09:34 compute-0 nova_compute[187243]: 2025-12-03 00:09:34.945 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:09:34 compute-0 nova_compute[187243]: 2025-12-03 00:09:34.946 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:09:34 compute-0 nova_compute[187243]: 2025-12-03 00:09:34.946 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:09:35 compute-0 podman[216530]: 2025-12-03 00:09:35.099605862 +0000 UTC m=+0.059184403 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.448 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.448 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.591 187247 DEBUG nova.virt.libvirt.vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:08:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-476',id=18,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:08:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-r72zwvbi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:08:37Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=13917c6d-537d-4b86-a989-9ce2df414798,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.591 187247 DEBUG nova.network.os_vif_util [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.592 187247 DEBUG nova.network.os_vif_util [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.592 187247 DEBUG nova.virt.libvirt.migration [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:03:5d:ed"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <target dev="tap08bf4d8e-df"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]: </interface>
Dec 03 00:09:35 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.593 187247 DEBUG nova.virt.libvirt.migration [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <name>instance-00000012</name>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <uuid>13917c6d-537d-4b86-a989-9ce2df414798</uuid>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927</nova:name>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:08:31</nova:creationTime>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:port uuid="08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb">
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <system>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="serial">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="uuid">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </system>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <os>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </os>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <features>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </features>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:03:5d:ed"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap08bf4d8e-df"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </target>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </console>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </input>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <video>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </video>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]: </domain>
Dec 03 00:09:35 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.593 187247 DEBUG nova.virt.libvirt.migration [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <name>instance-00000012</name>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <uuid>13917c6d-537d-4b86-a989-9ce2df414798</uuid>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927</nova:name>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:08:31</nova:creationTime>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:port uuid="08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb">
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <system>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="serial">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="uuid">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </system>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <os>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </os>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <features>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </features>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:03:5d:ed"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap08bf4d8e-df"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </target>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </console>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </input>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <video>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </video>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]: </domain>
Dec 03 00:09:35 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.594 187247 DEBUG nova.virt.libvirt.migration [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <name>instance-00000012</name>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <uuid>13917c6d-537d-4b86-a989-9ce2df414798</uuid>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927</nova:name>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:08:31</nova:creationTime>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:user uuid="d7f72082c96e4f868d5b158a57237cee">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin</nova:user>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:project uuid="869170c9b0864bd8a0f2258e90e55a84">tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579</nova:project>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <nova:port uuid="08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb">
Dec 03 00:09:35 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <system>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="serial">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="uuid">13917c6d-537d-4b86-a989-9ce2df414798</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </system>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <os>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </os>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <features>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </features>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/disk.config"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:03:5d:ed"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target dev="tap08bf4d8e-df"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:09:35 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       </target>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798/console.log" append="off"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </console>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </input>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <video>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </video>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:09:35 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:09:35 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:09:35 compute-0 nova_compute[187243]: </domain>
Dec 03 00:09:35 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.594 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.951 187247 DEBUG nova.virt.libvirt.migration [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:09:35 compute-0 nova_compute[187243]: 2025-12-03 00:09:35.951 187247 INFO nova.virt.libvirt.migration [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:09:37 compute-0 nova_compute[187243]: 2025-12-03 00:09:37.700 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:37 compute-0 nova_compute[187243]: 2025-12-03 00:09:37.984 187247 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.063 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 podman[216564]: 2025-12-03 00:09:38.099446448 +0000 UTC m=+0.055572034 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Dec 03 00:09:38 compute-0 kernel: tap08bf4d8e-df (unregistering): left promiscuous mode
Dec 03 00:09:38 compute-0 NetworkManager[55671]: <info>  [1764720578.3207] device (tap08bf4d8e-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.329 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 ovn_controller[95488]: 2025-12-03T00:09:38Z|00136|binding|INFO|Releasing lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb from this chassis (sb_readonly=0)
Dec 03 00:09:38 compute-0 ovn_controller[95488]: 2025-12-03T00:09:38Z|00137|binding|INFO|Setting lport 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb down in Southbound
Dec 03 00:09:38 compute-0 ovn_controller[95488]: 2025-12-03T00:09:38Z|00138|binding|INFO|Removing iface tap08bf4d8e-df ovn-installed in OVS
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.330 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.348 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec 03 00:09:38 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000012.scope: Consumed 14.527s CPU time.
Dec 03 00:09:38 compute-0 systemd-machined[153518]: Machine qemu-11-instance-00000012 terminated.
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.391 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5d:ed 10.100.0.12'], port_security=['fa:16:3e:03:5d:ed 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '13917c6d-537d-4b86-a989-9ce2df414798', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '869170c9b0864bd8a0f2258e90e55a84', 'neutron:revision_number': '10', 'neutron:security_group_ids': '21025524-a834-4687-a5db-4097a3a2991d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fb2dc55-b9aa-4540-a79d-797e2b8e81ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.392 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb in datapath 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 unbound from our chassis
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.393 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.394 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[747f3a75-c2fa-4cfe-8331-851be38eb38e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.394 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 namespace which is not needed anymore
Dec 03 00:09:38 compute-0 sshd-session[216559]: Invalid user vncuser from 61.220.235.10 port 60656
Dec 03 00:09:38 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [NOTICE]   (216288) : haproxy version is 3.0.5-8e879a5
Dec 03 00:09:38 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [NOTICE]   (216288) : path to executable is /usr/sbin/haproxy
Dec 03 00:09:38 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [WARNING]  (216288) : Exiting Master process...
Dec 03 00:09:38 compute-0 podman[216608]: 2025-12-03 00:09:38.501651166 +0000 UTC m=+0.030296299 container kill c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Dec 03 00:09:38 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [ALERT]    (216288) : Current worker (216290) exited with code 143 (Terminated)
Dec 03 00:09:38 compute-0 neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68[216265]: [WARNING]  (216288) : All workers exited. Exiting... (0)
Dec 03 00:09:38 compute-0 systemd[1]: libpod-c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1.scope: Deactivated successfully.
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.520 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.524 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 podman[216623]: 2025-12-03 00:09:38.539834036 +0000 UTC m=+0.021016653 container died c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.555 187247 DEBUG nova.virt.libvirt.guest [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.557 187247 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migration operation has completed
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.559 187247 INFO nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] _post_live_migration() is started..
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.561 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.561 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.562 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.576 187247 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.576 187247 WARNING neutronclient.v2_0.client [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1-userdata-shm.mount: Deactivated successfully.
Dec 03 00:09:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-cd2050e70701eccf5fa5e082c56894d16798cf9113bbe8a7b5c68c8f64c69a74-merged.mount: Deactivated successfully.
Dec 03 00:09:38 compute-0 sshd-session[216559]: Received disconnect from 61.220.235.10 port 60656:11: Bye Bye [preauth]
Dec 03 00:09:38 compute-0 sshd-session[216559]: Disconnected from invalid user vncuser 61.220.235.10 port 60656 [preauth]
Dec 03 00:09:38 compute-0 podman[216623]: 2025-12-03 00:09:38.651991788 +0000 UTC m=+0.133174385 container cleanup c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Dec 03 00:09:38 compute-0 systemd[1]: libpod-conmon-c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1.scope: Deactivated successfully.
Dec 03 00:09:38 compute-0 podman[216627]: 2025-12-03 00:09:38.717138756 +0000 UTC m=+0.190246526 container remove c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.723 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[36aaa3b0-4fb3-441b-ba4d-fd6538e25241]: (4, ("Wed Dec  3 12:09:38 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1)\nc7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1\nWed Dec  3 12:09:38 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 (c7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1)\nc7dd607bdf1a22ab2314be2f884c25319b320e634a6f7cce6597360dee1daed1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.724 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[19d5c752-75f8-4180-999f-628038f649ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.724 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c6ad8f4-62a9-4a0d-ac57-e980ee855c68.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.725 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[88edb2a9-655a-4314-8c37-4892cdb570fd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.725 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c6ad8f4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.727 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 kernel: tap9c6ad8f4-60: left promiscuous mode
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.743 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.746 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc17386-26bf-403b-8af8-7c19a6704cf5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.761 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[212d0b64-e857-4b96-897e-13fb4e82d18b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.762 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[50b3378b-76ba-40a4-8b2e-5e724ac37720]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.777 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[46ddcd42-3033-4745-b52f-ec3766361c0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460657, 'reachable_time': 40715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216675, 'error': None, 'target': 'ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.778 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c6ad8f4-62a9-4a0d-ac57-e980ee855c68 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:09:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:38.779 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[edf7444d-6c66-404d-a932-60bd4fcb59ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:09:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d9c6ad8f4\x2d62a9\x2d4a0d\x2dac57\x2de980ee855c68.mount: Deactivated successfully.
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.781 187247 DEBUG nova.compute.manager [req-49d1ed89-b770-4dc0-99e8-389eb9fe5761 req-7ae85627-3a6c-418f-986d-318de46ab3f6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.782 187247 DEBUG oslo_concurrency.lockutils [req-49d1ed89-b770-4dc0-99e8-389eb9fe5761 req-7ae85627-3a6c-418f-986d-318de46ab3f6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.782 187247 DEBUG oslo_concurrency.lockutils [req-49d1ed89-b770-4dc0-99e8-389eb9fe5761 req-7ae85627-3a6c-418f-986d-318de46ab3f6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.782 187247 DEBUG oslo_concurrency.lockutils [req-49d1ed89-b770-4dc0-99e8-389eb9fe5761 req-7ae85627-3a6c-418f-986d-318de46ab3f6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.782 187247 DEBUG nova.compute.manager [req-49d1ed89-b770-4dc0-99e8-389eb9fe5761 req-7ae85627-3a6c-418f-986d-318de46ab3f6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:38 compute-0 nova_compute[187243]: 2025-12-03 00:09:38.782 187247 DEBUG nova.compute.manager [req-49d1ed89-b770-4dc0-99e8-389eb9fe5761 req-7ae85627-3a6c-418f-986d-318de46ab3f6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:09:39 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:39.099 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:09:39 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:39.099 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.100 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.184 187247 DEBUG nova.network.neutron [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.185 187247 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.186 187247 DEBUG nova.virt.libvirt.vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:08:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-476622927',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-476',id=18,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:08:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='869170c9b0864bd8a0f2258e90e55a84',ramdisk_id='',reservation_id='r-r72zwvbi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1547126579-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:09:08Z,user_data=None,user_id='d7f72082c96e4f868d5b158a57237cee',uuid=13917c6d-537d-4b86-a989-9ce2df414798,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.186 187247 DEBUG nova.network.os_vif_util [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "address": "fa:16:3e:03:5d:ed", "network": {"id": "9c6ad8f4-62a9-4a0d-ac57-e980ee855c68", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-363933516-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b356b9112e0c4e6083f56fc1c7796972", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08bf4d8e-df", "ovs_interfaceid": "08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.186 187247 DEBUG nova.network.os_vif_util [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.187 187247 DEBUG os_vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.188 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.188 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08bf4d8e-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.189 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.190 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.191 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.191 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4a96656d-293e-467b-beb3-25f746ac12cf) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.191 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.194 187247 INFO os_vif [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5d:ed,bridge_name='br-int',has_traffic_filtering=True,id=08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb,network=Network(9c6ad8f4-62a9-4a0d-ac57-e980ee855c68),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08bf4d8e-df')
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.194 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.194 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.194 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.194 187247 DEBUG nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.195 187247 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Deleting instance files /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798_del
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.195 187247 INFO nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Deletion of /var/lib/nova/instances/13917c6d-537d-4b86-a989-9ce2df414798_del complete
Dec 03 00:09:39 compute-0 nova_compute[187243]: 2025-12-03 00:09:39.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.848 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.849 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.849 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.849 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.850 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.850 187247 WARNING nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received unexpected event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with vm_state active and task_state migrating.
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.850 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.850 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.850 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.851 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.851 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.851 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-unplugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.852 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.852 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.852 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.852 187247 DEBUG oslo_concurrency.lockutils [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.852 187247 DEBUG nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] No waiting events found dispatching network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:09:40 compute-0 nova_compute[187243]: 2025-12-03 00:09:40.853 187247 WARNING nova.compute.manager [req-b214ba0b-0e8f-43a4-95bd-20dfe4369a79 req-5c2165c7-5ace-4969-b6f1-ba1c4802d823 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Received unexpected event network-vif-plugged-08bf4d8e-dfd4-4e39-b1c5-a9f52488afbb for instance with vm_state active and task_state migrating.
Dec 03 00:09:43 compute-0 nova_compute[187243]: 2025-12-03 00:09:43.065 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:43 compute-0 nova_compute[187243]: 2025-12-03 00:09:43.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:09:44.101 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.154 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.155 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.155 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.155 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.288 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.289 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.306 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.307 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5833MB free_disk=73.1649055480957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.307 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:44 compute-0 nova_compute[187243]: 2025-12-03 00:09:44.307 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:45 compute-0 nova_compute[187243]: 2025-12-03 00:09:45.349 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Updating resource usage from migration 0843730c-83f4-417c-a99b-29298db49e9e
Dec 03 00:09:45 compute-0 nova_compute[187243]: 2025-12-03 00:09:45.385 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 0843730c-83f4-417c-a99b-29298db49e9e is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:09:45 compute-0 nova_compute[187243]: 2025-12-03 00:09:45.386 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:09:45 compute-0 nova_compute[187243]: 2025-12-03 00:09:45.386 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:09:44 up  1:17,  0 user,  load average: 0.17, 0.22, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_869170c9b0864bd8a0f2258e90e55a84': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:09:45 compute-0 nova_compute[187243]: 2025-12-03 00:09:45.468 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:09:48 compute-0 nova_compute[187243]: 2025-12-03 00:09:48.068 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:48 compute-0 podman[216679]: 2025-12-03 00:09:48.097653276 +0000 UTC m=+0.052368337 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:09:49 compute-0 nova_compute[187243]: 2025-12-03 00:09:49.193 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:49 compute-0 nova_compute[187243]: 2025-12-03 00:09:49.209 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:09:49 compute-0 nova_compute[187243]: 2025-12-03 00:09:49.740 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:09:49 compute-0 nova_compute[187243]: 2025-12-03 00:09:49.741 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.433s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:50 compute-0 nova_compute[187243]: 2025-12-03 00:09:50.736 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:50 compute-0 nova_compute[187243]: 2025-12-03 00:09:50.737 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:50 compute-0 nova_compute[187243]: 2025-12-03 00:09:50.737 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:50 compute-0 nova_compute[187243]: 2025-12-03 00:09:50.737 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:09:51 compute-0 podman[216704]: 2025-12-03 00:09:51.09331924 +0000 UTC m=+0.051068235 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:09:51 compute-0 podman[216705]: 2025-12-03 00:09:51.123371882 +0000 UTC m=+0.078810581 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:09:51 compute-0 nova_compute[187243]: 2025-12-03 00:09:51.785 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "13917c6d-537d-4b86-a989-9ce2df414798-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:51 compute-0 nova_compute[187243]: 2025-12-03 00:09:51.786 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:51 compute-0 nova_compute[187243]: 2025-12-03 00:09:51.786 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "13917c6d-537d-4b86-a989-9ce2df414798-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.297 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.297 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.298 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.298 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.416 187247 WARNING nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.417 187247 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.434 187247 DEBUG oslo_concurrency.processutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.435 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.16492462158203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.435 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:09:52 compute-0 nova_compute[187243]: 2025-12-03 00:09:52.435 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:09:53 compute-0 nova_compute[187243]: 2025-12-03 00:09:53.070 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:53 compute-0 nova_compute[187243]: 2025-12-03 00:09:53.453 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 13917c6d-537d-4b86-a989-9ce2df414798 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:09:53 compute-0 nova_compute[187243]: 2025-12-03 00:09:53.960 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:09:53 compute-0 nova_compute[187243]: 2025-12-03 00:09:53.984 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 0843730c-83f4-417c-a99b-29298db49e9e is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:09:53 compute-0 nova_compute[187243]: 2025-12-03 00:09:53.984 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:09:53 compute-0 nova_compute[187243]: 2025-12-03 00:09:53.984 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:09:52 up  1:18,  0 user,  load average: 0.14, 0.21, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:09:54 compute-0 nova_compute[187243]: 2025-12-03 00:09:54.017 187247 DEBUG nova.compute.provider_tree [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:09:54 compute-0 nova_compute[187243]: 2025-12-03 00:09:54.194 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:55 compute-0 nova_compute[187243]: 2025-12-03 00:09:55.213 187247 DEBUG nova.scheduler.client.report [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:09:55 compute-0 nova_compute[187243]: 2025-12-03 00:09:55.879 187247 DEBUG nova.compute.resource_tracker [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:09:55 compute-0 nova_compute[187243]: 2025-12-03 00:09:55.880 187247 DEBUG oslo_concurrency.lockutils [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.445s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:09:55 compute-0 nova_compute[187243]: 2025-12-03 00:09:55.895 187247 INFO nova.compute.manager [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:09:57 compute-0 nova_compute[187243]: 2025-12-03 00:09:57.395 187247 INFO nova.scheduler.client.report [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 0843730c-83f4-417c-a99b-29298db49e9e
Dec 03 00:09:57 compute-0 nova_compute[187243]: 2025-12-03 00:09:57.396 187247 DEBUG nova.virt.libvirt.driver [None req-87172c39-c9c5-4f3b-a8a5-5a75d435b50f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 13917c6d-537d-4b86-a989-9ce2df414798] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:09:58 compute-0 nova_compute[187243]: 2025-12-03 00:09:58.070 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:59 compute-0 nova_compute[187243]: 2025-12-03 00:09:59.197 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:09:59 compute-0 podman[197600]: time="2025-12-03T00:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:09:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:09:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:10:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:00.700 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:00.701 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:00.701 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: ERROR   00:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: ERROR   00:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: ERROR   00:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: ERROR   00:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: ERROR   00:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:10:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:10:03 compute-0 nova_compute[187243]: 2025-12-03 00:10:03.078 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:04 compute-0 nova_compute[187243]: 2025-12-03 00:10:04.210 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:06 compute-0 podman[216751]: 2025-12-03 00:10:06.143485503 +0000 UTC m=+0.085167446 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:10:08 compute-0 nova_compute[187243]: 2025-12-03 00:10:08.080 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:09 compute-0 podman[216773]: 2025-12-03 00:10:09.12280719 +0000 UTC m=+0.070977680 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:10:09 compute-0 nova_compute[187243]: 2025-12-03 00:10:09.245 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:13 compute-0 nova_compute[187243]: 2025-12-03 00:10:13.082 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:14 compute-0 nova_compute[187243]: 2025-12-03 00:10:14.247 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:15 compute-0 sshd-session[216794]: Invalid user desliga from 102.210.148.92 port 43672
Dec 03 00:10:16 compute-0 sshd-session[216794]: Received disconnect from 102.210.148.92 port 43672:11: Bye Bye [preauth]
Dec 03 00:10:16 compute-0 sshd-session[216794]: Disconnected from invalid user desliga 102.210.148.92 port 43672 [preauth]
Dec 03 00:10:17 compute-0 nova_compute[187243]: 2025-12-03 00:10:17.784 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:18 compute-0 nova_compute[187243]: 2025-12-03 00:10:18.086 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:19 compute-0 podman[216796]: 2025-12-03 00:10:19.096391428 +0000 UTC m=+0.049589499 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:10:19 compute-0 nova_compute[187243]: 2025-12-03 00:10:19.298 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:22 compute-0 podman[216823]: 2025-12-03 00:10:22.091376737 +0000 UTC m=+0.048933624 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base 
Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 00:10:22 compute-0 podman[216824]: 2025-12-03 00:10:22.117472842 +0000 UTC m=+0.072399724 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:10:23 compute-0 nova_compute[187243]: 2025-12-03 00:10:23.089 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:24 compute-0 nova_compute[187243]: 2025-12-03 00:10:24.300 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:27 compute-0 sshd-session[216866]: Invalid user test from 20.123.120.169 port 57400
Dec 03 00:10:27 compute-0 sshd-session[216866]: Received disconnect from 20.123.120.169 port 57400:11: Bye Bye [preauth]
Dec 03 00:10:27 compute-0 sshd-session[216866]: Disconnected from invalid user test 20.123.120.169 port 57400 [preauth]
Dec 03 00:10:28 compute-0 nova_compute[187243]: 2025-12-03 00:10:28.089 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:29 compute-0 nova_compute[187243]: 2025-12-03 00:10:29.302 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:29 compute-0 podman[197600]: time="2025-12-03T00:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:10:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:10:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: ERROR   00:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: ERROR   00:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: ERROR   00:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: ERROR   00:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: ERROR   00:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:10:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:10:33 compute-0 nova_compute[187243]: 2025-12-03 00:10:33.129 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:34 compute-0 nova_compute[187243]: 2025-12-03 00:10:34.366 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:36 compute-0 sshd-session[216868]: Received disconnect from 49.247.36.49 port 48409:11: Bye Bye [preauth]
Dec 03 00:10:36 compute-0 sshd-session[216868]: Disconnected from authenticating user root 49.247.36.49 port 48409 [preauth]
Dec 03 00:10:37 compute-0 podman[216872]: 2025-12-03 00:10:37.090360544 +0000 UTC m=+0.052461589 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal)
Dec 03 00:10:37 compute-0 sshd-session[216870]: Received disconnect from 23.95.37.90 port 40064:11: Bye Bye [preauth]
Dec 03 00:10:37 compute-0 sshd-session[216870]: Disconnected from authenticating user root 23.95.37.90 port 40064 [preauth]
Dec 03 00:10:38 compute-0 nova_compute[187243]: 2025-12-03 00:10:38.131 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:39 compute-0 nova_compute[187243]: 2025-12-03 00:10:39.372 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:40 compute-0 podman[216893]: 2025-12-03 00:10:40.09303276 +0000 UTC m=+0.049725493 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:10:40 compute-0 nova_compute[187243]: 2025-12-03 00:10:40.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:40 compute-0 nova_compute[187243]: 2025-12-03 00:10:40.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:40 compute-0 nova_compute[187243]: 2025-12-03 00:10:40.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:40 compute-0 nova_compute[187243]: 2025-12-03 00:10:40.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:10:43 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:43.148 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:10:43 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:43.148 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:10:43 compute-0 nova_compute[187243]: 2025-12-03 00:10:43.169 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:44 compute-0 nova_compute[187243]: 2025-12-03 00:10:44.411 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:45.150 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:10:45 compute-0 nova_compute[187243]: 2025-12-03 00:10:45.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:45 compute-0 nova_compute[187243]: 2025-12-03 00:10:45.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:46 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:46.768 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:a0:ba 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05a149bd8b504e438531bb5b9409e4db', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c) old=Port_Binding(mac=['fa:16:3e:63:a0:ba'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05a149bd8b504e438531bb5b9409e4db', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:10:46 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:46.770 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c in datapath 44651134-dca8-45c2-963a-1f17aac67593 updated
Dec 03 00:10:46 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:46.771 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44651134-dca8-45c2-963a-1f17aac67593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:10:46 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:10:46.772 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[36af4221-85a4-453b-bc0c-9eb1a1dfe0f6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:10:46 compute-0 nova_compute[187243]: 2025-12-03 00:10:46.982 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:46 compute-0 nova_compute[187243]: 2025-12-03 00:10:46.983 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:46 compute-0 nova_compute[187243]: 2025-12-03 00:10:46.983 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:46 compute-0 nova_compute[187243]: 2025-12-03 00:10:46.983 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:10:47 compute-0 nova_compute[187243]: 2025-12-03 00:10:47.105 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:10:47 compute-0 nova_compute[187243]: 2025-12-03 00:10:47.106 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:10:47 compute-0 nova_compute[187243]: 2025-12-03 00:10:47.124 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:10:47 compute-0 nova_compute[187243]: 2025-12-03 00:10:47.124 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5832MB free_disk=73.16250610351562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:10:47 compute-0 nova_compute[187243]: 2025-12-03 00:10:47.125 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:10:47 compute-0 nova_compute[187243]: 2025-12-03 00:10:47.125 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:10:48 compute-0 nova_compute[187243]: 2025-12-03 00:10:48.170 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:49 compute-0 nova_compute[187243]: 2025-12-03 00:10:49.413 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:49 compute-0 nova_compute[187243]: 2025-12-03 00:10:49.856 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:10:49 compute-0 nova_compute[187243]: 2025-12-03 00:10:49.856 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:10:47 up  1:18,  0 user,  load average: 0.05, 0.17, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:10:49 compute-0 nova_compute[187243]: 2025-12-03 00:10:49.876 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:10:50 compute-0 podman[216917]: 2025-12-03 00:10:50.086517082 +0000 UTC m=+0.048109983 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:10:50 compute-0 nova_compute[187243]: 2025-12-03 00:10:50.384 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:10:50 compute-0 nova_compute[187243]: 2025-12-03 00:10:50.891 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:10:50 compute-0 nova_compute[187243]: 2025-12-03 00:10:50.891 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.766s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:10:51 compute-0 nova_compute[187243]: 2025-12-03 00:10:51.887 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:51 compute-0 nova_compute[187243]: 2025-12-03 00:10:51.887 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:51 compute-0 nova_compute[187243]: 2025-12-03 00:10:51.887 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:53 compute-0 podman[216941]: 2025-12-03 00:10:53.090338575 +0000 UTC m=+0.052374767 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 03 00:10:53 compute-0 podman[216942]: 2025-12-03 00:10:53.119316751 +0000 UTC m=+0.077242692 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:10:53 compute-0 nova_compute[187243]: 2025-12-03 00:10:53.171 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:54 compute-0 nova_compute[187243]: 2025-12-03 00:10:54.415 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:56 compute-0 nova_compute[187243]: 2025-12-03 00:10:56.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:10:57 compute-0 ovn_controller[95488]: 2025-12-03T00:10:57Z|00139|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:10:58 compute-0 nova_compute[187243]: 2025-12-03 00:10:58.172 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:59 compute-0 nova_compute[187243]: 2025-12-03 00:10:59.417 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:10:59 compute-0 podman[197600]: time="2025-12-03T00:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:10:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:10:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:11:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:00.702 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:00.703 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:00.703 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: ERROR   00:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: ERROR   00:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: ERROR   00:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: ERROR   00:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: ERROR   00:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:11:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:11:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:01.957 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:3f:ab 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=06e69b18-9531-430d-9cad-90848fbfd86e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e5672078-29f7-493b-a2d0-b68ca62fdf76) old=Port_Binding(mac=['fa:16:3e:7c:3f:ab'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4a14d90-e145-46fe-ae48-a3de49800b87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:11:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:01.958 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e5672078-29f7-493b-a2d0-b68ca62fdf76 in datapath a4a14d90-e145-46fe-ae48-a3de49800b87 updated
Dec 03 00:11:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:01.959 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4a14d90-e145-46fe-ae48-a3de49800b87, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:11:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:01.960 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[436bbffb-1876-4c4b-a4d5-06af48a0756e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:03 compute-0 nova_compute[187243]: 2025-12-03 00:11:03.174 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:03 compute-0 sshd-session[216987]: Invalid user deploy from 61.220.235.10 port 59810
Dec 03 00:11:03 compute-0 sshd-session[216987]: Received disconnect from 61.220.235.10 port 59810:11: Bye Bye [preauth]
Dec 03 00:11:03 compute-0 sshd-session[216987]: Disconnected from invalid user deploy 61.220.235.10 port 59810 [preauth]
Dec 03 00:11:04 compute-0 nova_compute[187243]: 2025-12-03 00:11:04.419 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:08 compute-0 podman[216989]: 2025-12-03 00:11:08.091315011 +0000 UTC m=+0.053095555 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:11:08 compute-0 nova_compute[187243]: 2025-12-03 00:11:08.174 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:09 compute-0 nova_compute[187243]: 2025-12-03 00:11:09.422 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:09 compute-0 nova_compute[187243]: 2025-12-03 00:11:09.776 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:09 compute-0 nova_compute[187243]: 2025-12-03 00:11:09.776 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:10 compute-0 nova_compute[187243]: 2025-12-03 00:11:10.282 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:11:10 compute-0 nova_compute[187243]: 2025-12-03 00:11:10.828 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:10 compute-0 nova_compute[187243]: 2025-12-03 00:11:10.828 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:10 compute-0 nova_compute[187243]: 2025-12-03 00:11:10.835 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:11:10 compute-0 nova_compute[187243]: 2025-12-03 00:11:10.837 187247 INFO nova.compute.claims [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:11:11 compute-0 podman[217013]: 2025-12-03 00:11:11.090525881 +0000 UTC m=+0.050611094 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 03 00:11:11 compute-0 nova_compute[187243]: 2025-12-03 00:11:11.898 187247 DEBUG nova.compute.provider_tree [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:11:12 compute-0 nova_compute[187243]: 2025-12-03 00:11:12.473 187247 DEBUG nova.scheduler.client.report [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:11:13 compute-0 nova_compute[187243]: 2025-12-03 00:11:13.117 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.289s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:13 compute-0 nova_compute[187243]: 2025-12-03 00:11:13.118 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:11:13 compute-0 nova_compute[187243]: 2025-12-03 00:11:13.177 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:14 compute-0 nova_compute[187243]: 2025-12-03 00:11:14.453 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:18 compute-0 nova_compute[187243]: 2025-12-03 00:11:18.206 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:19 compute-0 nova_compute[187243]: 2025-12-03 00:11:19.242 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:11:19 compute-0 nova_compute[187243]: 2025-12-03 00:11:19.242 187247 DEBUG nova.network.neutron [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:11:19 compute-0 nova_compute[187243]: 2025-12-03 00:11:19.243 187247 WARNING neutronclient.v2_0.client [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:19 compute-0 nova_compute[187243]: 2025-12-03 00:11:19.243 187247 WARNING neutronclient.v2_0.client [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:19 compute-0 nova_compute[187243]: 2025-12-03 00:11:19.456 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:19 compute-0 nova_compute[187243]: 2025-12-03 00:11:19.773 187247 INFO nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:11:20 compute-0 nova_compute[187243]: 2025-12-03 00:11:20.104 187247 DEBUG nova.network.neutron [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Successfully created port: f36e34c0-cc70-4a73-b904-d40c504fefa3 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:11:20 compute-0 nova_compute[187243]: 2025-12-03 00:11:20.280 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:11:21 compute-0 podman[217035]: 2025-12-03 00:11:21.095743931 +0000 UTC m=+0.053449683 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.257 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.258 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.259 187247 INFO nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Creating image(s)
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.259 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.259 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.260 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.260 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.263 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.265 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.299 187247 DEBUG nova.network.neutron [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Successfully updated port: f36e34c0-cc70-4a73-b904-d40c504fefa3 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.316 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.317 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.318 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.319 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.323 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.323 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.352 187247 DEBUG nova.compute.manager [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-changed-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.353 187247 DEBUG nova.compute.manager [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Refreshing instance network info cache due to event network-changed-f36e34c0-cc70-4a73-b904-d40c504fefa3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.354 187247 DEBUG oslo_concurrency.lockutils [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.354 187247 DEBUG oslo_concurrency.lockutils [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.354 187247 DEBUG nova.network.neutron [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Refreshing network info cache for port f36e34c0-cc70-4a73-b904-d40c504fefa3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.374 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.374 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.410 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.410 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.411 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.475 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.476 187247 DEBUG nova.virt.disk.api [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Checking if we can resize image /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.476 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.527 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.528 187247 DEBUG nova.virt.disk.api [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Cannot resize image /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.528 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.528 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Ensure instance console log exists: /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.529 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.529 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.529 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.808 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:11:22 compute-0 nova_compute[187243]: 2025-12-03 00:11:22.862 187247 WARNING neutronclient.v2_0.client [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:23 compute-0 nova_compute[187243]: 2025-12-03 00:11:23.208 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:23 compute-0 nova_compute[187243]: 2025-12-03 00:11:23.610 187247 DEBUG nova.network.neutron [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:11:23 compute-0 nova_compute[187243]: 2025-12-03 00:11:23.802 187247 DEBUG nova.network.neutron [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:11:24 compute-0 podman[217074]: 2025-12-03 00:11:24.118436744 +0000 UTC m=+0.065241981 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:11:24 compute-0 podman[217075]: 2025-12-03 00:11:24.127854453 +0000 UTC m=+0.081576578 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:11:24 compute-0 nova_compute[187243]: 2025-12-03 00:11:24.458 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:26 compute-0 sshd-session[217120]: Invalid user valheim from 102.210.148.92 port 32890
Dec 03 00:11:27 compute-0 sshd-session[217120]: Received disconnect from 102.210.148.92 port 32890:11: Bye Bye [preauth]
Dec 03 00:11:27 compute-0 sshd-session[217120]: Disconnected from invalid user valheim 102.210.148.92 port 32890 [preauth]
Dec 03 00:11:27 compute-0 nova_compute[187243]: 2025-12-03 00:11:27.191 187247 DEBUG oslo_concurrency.lockutils [req-97593fbf-22e1-436e-ae16-cd67117ee188 req-7e90bea0-b83d-4c21-9bf3-a588a00fd896 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:11:27 compute-0 nova_compute[187243]: 2025-12-03 00:11:27.191 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquired lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:11:27 compute-0 nova_compute[187243]: 2025-12-03 00:11:27.191 187247 DEBUG nova.network.neutron [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:11:28 compute-0 nova_compute[187243]: 2025-12-03 00:11:28.210 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:28 compute-0 nova_compute[187243]: 2025-12-03 00:11:28.610 187247 DEBUG nova.network.neutron [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:11:28 compute-0 nova_compute[187243]: 2025-12-03 00:11:28.845 187247 WARNING neutronclient.v2_0.client [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:28 compute-0 nova_compute[187243]: 2025-12-03 00:11:28.998 187247 DEBUG nova.network.neutron [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.460 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.575 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Releasing lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.576 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance network_info: |[{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.578 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Start _get_guest_xml network_info=[{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.581 187247 WARNING nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.582 187247 DEBUG nova.virt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932', uuid='eb1b85fe-471a-46bd-9929-c377144cb8eb'), owner=OwnerMeta(userid='6048ff4ab0aa45689a23ca16a6558b9d', username='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin', projectid='8618acb8fd774a27ac00f4e0f10b934c', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": 
"f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720689.5824358) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.591 187247 DEBUG nova.virt.libvirt.host [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.592 187247 DEBUG nova.virt.libvirt.host [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.594 187247 DEBUG nova.virt.libvirt.host [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.595 187247 DEBUG nova.virt.libvirt.host [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.596 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.596 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.596 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.597 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.597 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.597 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.597 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.597 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.598 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.598 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.598 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.598 187247 DEBUG nova.virt.hardware [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.601 187247 DEBUG nova.virt.libvirt.vif [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1319059932',id=20,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0nyztijh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tem
pest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:11:20Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.601 187247 DEBUG nova.network.os_vif_util [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converting VIF {"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.602 187247 DEBUG nova.network.os_vif_util [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:11:29 compute-0 nova_compute[187243]: 2025-12-03 00:11:29.602 187247 DEBUG nova.objects.instance [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lazy-loading 'pci_devices' on Instance uuid eb1b85fe-471a-46bd-9929-c377144cb8eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:11:29 compute-0 podman[197600]: time="2025-12-03T00:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:11:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:11:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.111 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <uuid>eb1b85fe-471a-46bd-9929-c377144cb8eb</uuid>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <name>instance-00000014</name>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932</nova:name>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:11:29</nova:creationTime>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:11:30 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:11:30 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:user uuid="6048ff4ab0aa45689a23ca16a6558b9d">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin</nova:user>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:project uuid="8618acb8fd774a27ac00f4e0f10b934c">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013</nova:project>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         <nova:port uuid="f36e34c0-cc70-4a73-b904-d40c504fefa3">
Dec 03 00:11:30 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <system>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <entry name="serial">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <entry name="uuid">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </system>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <os>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </os>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <features>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </features>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:ee:f1:9d"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <target dev="tapf36e34c0-cc"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <video>
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </video>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:11:30 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:11:30 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:11:30 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:11:30 compute-0 nova_compute[187243]: </domain>
Dec 03 00:11:30 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.113 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Preparing to wait for external event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.113 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.113 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.113 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.114 187247 DEBUG nova.virt.libvirt.vif [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1319059932',id=20,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0nyztijh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user
_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:11:20Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.115 187247 DEBUG nova.network.os_vif_util [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converting VIF {"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.115 187247 DEBUG nova.network.os_vif_util [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.115 187247 DEBUG os_vif [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.116 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.116 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.117 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.117 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.118 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '72379746-54a5-5b3c-8e2b-3b5fd7f8917b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.118 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.120 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.122 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.122 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf36e34c0-cc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.122 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf36e34c0-cc, col_values=(('qos', UUID('0d2552a7-ddb0-46b3-8339-64cacd3de002')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.122 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf36e34c0-cc, col_values=(('external_ids', {'iface-id': 'f36e34c0-cc70-4a73-b904-d40c504fefa3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:f1:9d', 'vm-uuid': 'eb1b85fe-471a-46bd-9929-c377144cb8eb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.123 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 NetworkManager[55671]: <info>  [1764720690.1240] manager: (tapf36e34c0-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.125 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.129 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:30 compute-0 nova_compute[187243]: 2025-12-03 00:11:30.129 187247 INFO os_vif [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc')
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: ERROR   00:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: ERROR   00:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: ERROR   00:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: ERROR   00:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: ERROR   00:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:11:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:11:31 compute-0 nova_compute[187243]: 2025-12-03 00:11:31.770 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:11:31 compute-0 nova_compute[187243]: 2025-12-03 00:11:31.770 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:11:31 compute-0 nova_compute[187243]: 2025-12-03 00:11:31.771 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] No VIF found with MAC fa:16:3e:ee:f1:9d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:11:31 compute-0 nova_compute[187243]: 2025-12-03 00:11:31.771 187247 INFO nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Using config drive
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.532 187247 WARNING neutronclient.v2_0.client [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.676 187247 INFO nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Creating config drive at /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.687 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpk9mazote execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.830 187247 DEBUG oslo_concurrency.processutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpk9mazote" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:32 compute-0 kernel: tapf36e34c0-cc: entered promiscuous mode
Dec 03 00:11:32 compute-0 ovn_controller[95488]: 2025-12-03T00:11:32Z|00140|binding|INFO|Claiming lport f36e34c0-cc70-4a73-b904-d40c504fefa3 for this chassis.
Dec 03 00:11:32 compute-0 ovn_controller[95488]: 2025-12-03T00:11:32Z|00141|binding|INFO|f36e34c0-cc70-4a73-b904-d40c504fefa3: Claiming fa:16:3e:ee:f1:9d 10.100.0.13
Dec 03 00:11:32 compute-0 NetworkManager[55671]: <info>  [1764720692.9206] manager: (tapf36e34c0-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.919 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.922 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.935 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f1:9d 10.100.0.13'], port_security=['fa:16:3e:ee:f1:9d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'eb1b85fe-471a-46bd-9929-c377144cb8eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1aec6d5f-c8c6-4b74-ad3d-5af55712b2e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=f36e34c0-cc70-4a73-b904-d40c504fefa3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.937 104379 INFO neutron.agent.ovn.metadata.agent [-] Port f36e34c0-cc70-4a73-b904-d40c504fefa3 in datapath 44651134-dca8-45c2-963a-1f17aac67593 bound to our chassis
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.939 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.964 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3cae8f-0452-4626-b6a8-90d159eee4de]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.966 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap44651134-d1 in ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.968 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap44651134-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.968 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[810b223c-14fc-47c7-947f-d1695d28891c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.970 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3009fd78-34df-4c32-98f3-3286ecbb6c1c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:32 compute-0 systemd-machined[153518]: New machine qemu-12-instance-00000014.
Dec 03 00:11:32 compute-0 systemd-udevd[217143]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.978 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:32 compute-0 ovn_controller[95488]: 2025-12-03T00:11:32Z|00142|binding|INFO|Setting lport f36e34c0-cc70-4a73-b904-d40c504fefa3 ovn-installed in OVS
Dec 03 00:11:32 compute-0 ovn_controller[95488]: 2025-12-03T00:11:32Z|00143|binding|INFO|Setting lport f36e34c0-cc70-4a73-b904-d40c504fefa3 up in Southbound
Dec 03 00:11:32 compute-0 nova_compute[187243]: 2025-12-03 00:11:32.984 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:32 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000014.
Dec 03 00:11:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:32.991 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[c00b364e-2d4b-4649-9048-a9a6b82b01f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 NetworkManager[55671]: <info>  [1764720693.0022] device (tapf36e34c0-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:11:33 compute-0 NetworkManager[55671]: <info>  [1764720693.0036] device (tapf36e34c0-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.009 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8e956efa-be48-430f-9fc8-4874dc407d6c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.038 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[e145a16d-59cc-4264-8ee1-027bd3a397bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 NetworkManager[55671]: <info>  [1764720693.0440] manager: (tap44651134-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.043 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[96ace34d-da65-4eda-b41c-3ebd2117ed4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.077 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[a62cc2e3-9bf5-40a5-b244-87605edf7724]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.080 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[c47a6a60-b383-49fa-9492-577005305d24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 NetworkManager[55671]: <info>  [1764720693.1031] device (tap44651134-d0): carrier: link connected
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.109 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa6fe4b-b69d-49d4-8af8-1954a22a1a36]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.124 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6264ea9b-117f-4371-a61f-c57248154f87]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44651134-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:a0:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478374, 'reachable_time': 37893, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217175, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.140 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d8ed35d8-6744-4a56-bae2-7f876f11671b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:a0ba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 478374, 'tstamp': 478374}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217176, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.154 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[142ab802-4479-4b80-8e62-3ad5636cb7e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap44651134-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:a0:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478374, 'reachable_time': 37893, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217177, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.165 187247 DEBUG nova.compute.manager [req-597cbfeb-913f-4294-aab2-675ab779df1b req-9e9169b3-6353-436f-b28b-92edf3247fa7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.165 187247 DEBUG oslo_concurrency.lockutils [req-597cbfeb-913f-4294-aab2-675ab779df1b req-9e9169b3-6353-436f-b28b-92edf3247fa7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.165 187247 DEBUG oslo_concurrency.lockutils [req-597cbfeb-913f-4294-aab2-675ab779df1b req-9e9169b3-6353-436f-b28b-92edf3247fa7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.166 187247 DEBUG oslo_concurrency.lockutils [req-597cbfeb-913f-4294-aab2-675ab779df1b req-9e9169b3-6353-436f-b28b-92edf3247fa7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.166 187247 DEBUG nova.compute.manager [req-597cbfeb-913f-4294-aab2-675ab779df1b req-9e9169b3-6353-436f-b28b-92edf3247fa7 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Processing event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.182 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bae0d17f-1b4d-479f-be03-f74f98ffb731]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.210 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.237 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e92e79-a73c-4dec-8400-03af83b22ce5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.238 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44651134-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.238 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.238 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44651134-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.239 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:33 compute-0 kernel: tap44651134-d0: entered promiscuous mode
Dec 03 00:11:33 compute-0 NetworkManager[55671]: <info>  [1764720693.2404] manager: (tap44651134-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.241 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap44651134-d0, col_values=(('external_ids', {'iface-id': 'c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:11:33 compute-0 ovn_controller[95488]: 2025-12-03T00:11:33Z|00144|binding|INFO|Releasing lport c9c5ad1f-d82d-4b26-aa35-4c2bd8e4a10c from this chassis (sb_readonly=0)
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.243 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.244 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[74dc758b-2927-449a-b69c-8a1b4802834b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.244 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.245 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.245 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 44651134-dca8-45c2-963a-1f17aac67593 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.245 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.245 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb1e1fe-9972-4774-9a57-728a882b0c74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.246 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.246 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[017eac0e-a150-4a7e-8874-1f79baed5502]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.247 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID 44651134-dca8-45c2-963a-1f17aac67593
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:11:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:33.247 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'env', 'PROCESS_TAG=haproxy-44651134-dca8-45c2-963a-1f17aac67593', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/44651134-dca8-45c2-963a-1f17aac67593.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.254 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.453 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.456 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.459 187247 INFO nova.virt.libvirt.driver [-] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance spawned successfully.
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.460 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:11:33 compute-0 podman[217217]: 2025-12-03 00:11:33.634789024 +0000 UTC m=+0.058636660 container create 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 00:11:33 compute-0 systemd[1]: Started libpod-conmon-2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99.scope.
Dec 03 00:11:33 compute-0 podman[217217]: 2025-12-03 00:11:33.6058771 +0000 UTC m=+0.029724756 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:11:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:11:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca6e31c1fdfd30b84a444aa4cc1d81ee451f9f44871f6000da0a6d8eb5c1e32d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:11:33 compute-0 podman[217217]: 2025-12-03 00:11:33.732948855 +0000 UTC m=+0.156796541 container init 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Dec 03 00:11:33 compute-0 podman[217217]: 2025-12-03 00:11:33.737941437 +0000 UTC m=+0.161789083 container start 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:11:33 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [NOTICE]   (217236) : New worker (217238) forked
Dec 03 00:11:33 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [NOTICE]   (217236) : Loading success.
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.982 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.982 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.983 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.983 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.983 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:33 compute-0 nova_compute[187243]: 2025-12-03 00:11:33.984 187247 DEBUG nova.virt.libvirt.driver [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:11:34 compute-0 nova_compute[187243]: 2025-12-03 00:11:34.494 187247 INFO nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Took 12.24 seconds to spawn the instance on the hypervisor.
Dec 03 00:11:34 compute-0 nova_compute[187243]: 2025-12-03 00:11:34.495 187247 DEBUG nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.037 187247 INFO nova.compute.manager [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Took 24.25 seconds to build instance.
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.123 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.249 187247 DEBUG nova.compute.manager [req-47337783-ef94-45d2-b088-a96c7d299c47 req-c30a4296-a2ca-4e5f-b4e9-e5146bcf6d0d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.249 187247 DEBUG oslo_concurrency.lockutils [req-47337783-ef94-45d2-b088-a96c7d299c47 req-c30a4296-a2ca-4e5f-b4e9-e5146bcf6d0d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.250 187247 DEBUG oslo_concurrency.lockutils [req-47337783-ef94-45d2-b088-a96c7d299c47 req-c30a4296-a2ca-4e5f-b4e9-e5146bcf6d0d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.250 187247 DEBUG oslo_concurrency.lockutils [req-47337783-ef94-45d2-b088-a96c7d299c47 req-c30a4296-a2ca-4e5f-b4e9-e5146bcf6d0d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.250 187247 DEBUG nova.compute.manager [req-47337783-ef94-45d2-b088-a96c7d299c47 req-c30a4296-a2ca-4e5f-b4e9-e5146bcf6d0d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.251 187247 WARNING nova.compute.manager [req-47337783-ef94-45d2-b088-a96c7d299c47 req-c30a4296-a2ca-4e5f-b4e9-e5146bcf6d0d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received unexpected event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with vm_state active and task_state None.
Dec 03 00:11:35 compute-0 nova_compute[187243]: 2025-12-03 00:11:35.543 187247 DEBUG oslo_concurrency.lockutils [None req-40304720-b953-40e1-a20f-6d30e3c6fa5d 6048ff4ab0aa45689a23ca16a6558b9d 8618acb8fd774a27ac00f4e0f10b934c - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.766s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:38 compute-0 nova_compute[187243]: 2025-12-03 00:11:38.212 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:39 compute-0 podman[217249]: 2025-12-03 00:11:39.117620846 +0000 UTC m=+0.070610511 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:11:40 compute-0 nova_compute[187243]: 2025-12-03 00:11:40.126 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:40 compute-0 sshd-session[217247]: Invalid user username from 45.78.222.160 port 59818
Dec 03 00:11:40 compute-0 nova_compute[187243]: 2025-12-03 00:11:40.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:41 compute-0 nova_compute[187243]: 2025-12-03 00:11:41.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:41 compute-0 sshd-session[217247]: Received disconnect from 45.78.222.160 port 59818:11: Bye Bye [preauth]
Dec 03 00:11:41 compute-0 sshd-session[217247]: Disconnected from invalid user username 45.78.222.160 port 59818 [preauth]
Dec 03 00:11:42 compute-0 podman[217270]: 2025-12-03 00:11:42.112515772 +0000 UTC m=+0.066387638 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd, config_id=multipathd)
Dec 03 00:11:42 compute-0 nova_compute[187243]: 2025-12-03 00:11:42.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:42 compute-0 nova_compute[187243]: 2025-12-03 00:11:42.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:11:43 compute-0 nova_compute[187243]: 2025-12-03 00:11:43.214 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:45 compute-0 nova_compute[187243]: 2025-12-03 00:11:45.127 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:46 compute-0 nova_compute[187243]: 2025-12-03 00:11:46.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:46 compute-0 nova_compute[187243]: 2025-12-03 00:11:46.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:46 compute-0 ovn_controller[95488]: 2025-12-03T00:11:46Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:f1:9d 10.100.0.13
Dec 03 00:11:46 compute-0 ovn_controller[95488]: 2025-12-03T00:11:46Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:f1:9d 10.100.0.13
Dec 03 00:11:48 compute-0 nova_compute[187243]: 2025-12-03 00:11:48.216 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:48 compute-0 nova_compute[187243]: 2025-12-03 00:11:48.636 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:48 compute-0 nova_compute[187243]: 2025-12-03 00:11:48.636 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:48 compute-0 nova_compute[187243]: 2025-12-03 00:11:48.636 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:48 compute-0 nova_compute[187243]: 2025-12-03 00:11:48.637 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.130 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.644 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.715 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.716 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.767 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.886 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.887 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.903 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.904 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.13346481323242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.904 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:11:50 compute-0 nova_compute[187243]: 2025-12-03 00:11:50.904 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.944 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance eb1b85fe-471a-46bd-9929-c377144cb8eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.945 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.945 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:11:50 up  1:20,  0 user,  load average: 0.22, 0.19, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_8618acb8fd774a27ac00f4e0f10b934c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.961 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.979 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.979 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:11:51 compute-0 nova_compute[187243]: 2025-12-03 00:11:51.990 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:11:52 compute-0 nova_compute[187243]: 2025-12-03 00:11:52.007 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:11:52 compute-0 nova_compute[187243]: 2025-12-03 00:11:52.044 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:11:52 compute-0 podman[217313]: 2025-12-03 00:11:52.146182126 +0000 UTC m=+0.097167518 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:11:52 compute-0 nova_compute[187243]: 2025-12-03 00:11:52.550 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:11:53 compute-0 nova_compute[187243]: 2025-12-03 00:11:53.059 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:11:53 compute-0 nova_compute[187243]: 2025-12-03 00:11:53.059 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:11:53 compute-0 nova_compute[187243]: 2025-12-03 00:11:53.261 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-0 nova_compute[187243]: 2025-12-03 00:11:55.055 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:55 compute-0 nova_compute[187243]: 2025-12-03 00:11:55.056 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:55 compute-0 nova_compute[187243]: 2025-12-03 00:11:55.056 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:11:55 compute-0 podman[217338]: 2025-12-03 00:11:55.088456849 +0000 UTC m=+0.045350205 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:11:55 compute-0 nova_compute[187243]: 2025-12-03 00:11:55.131 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:55 compute-0 podman[217339]: 2025-12-03 00:11:55.133068086 +0000 UTC m=+0.082110661 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 03 00:11:58 compute-0 nova_compute[187243]: 2025-12-03 00:11:58.264 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:58 compute-0 sshd-session[217382]: Invalid user dolphinscheduler from 20.123.120.169 port 56722
Dec 03 00:11:58 compute-0 sshd-session[217382]: Received disconnect from 20.123.120.169 port 56722:11: Bye Bye [preauth]
Dec 03 00:11:58 compute-0 sshd-session[217382]: Disconnected from invalid user dolphinscheduler 20.123.120.169 port 56722 [preauth]
Dec 03 00:11:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:58.821 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:11:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:11:58.821 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:11:58 compute-0 nova_compute[187243]: 2025-12-03 00:11:58.822 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:11:59 compute-0 podman[197600]: time="2025-12-03T00:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:11:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:11:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Dec 03 00:12:00 compute-0 nova_compute[187243]: 2025-12-03 00:12:00.134 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:00.704 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:00.704 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:00.704 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: ERROR   00:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: ERROR   00:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: ERROR   00:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: ERROR   00:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: ERROR   00:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:12:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:12:03 compute-0 nova_compute[187243]: 2025-12-03 00:12:03.294 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:05 compute-0 nova_compute[187243]: 2025-12-03 00:12:05.136 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:05 compute-0 sshd-session[217388]: Invalid user minecraft from 49.247.36.49 port 17902
Dec 03 00:12:05 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:05.823 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:06 compute-0 sshd-session[217388]: Received disconnect from 49.247.36.49 port 17902:11: Bye Bye [preauth]
Dec 03 00:12:06 compute-0 sshd-session[217388]: Disconnected from invalid user minecraft 49.247.36.49 port 17902 [preauth]
Dec 03 00:12:08 compute-0 nova_compute[187243]: 2025-12-03 00:12:08.312 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:10 compute-0 podman[217390]: 2025-12-03 00:12:10.098653628 +0000 UTC m=+0.056672491 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:12:10 compute-0 nova_compute[187243]: 2025-12-03 00:12:10.137 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:12 compute-0 sshd-session[217386]: Received disconnect from 45.78.219.213 port 49166:11: Bye Bye [preauth]
Dec 03 00:12:12 compute-0 sshd-session[217386]: Disconnected from authenticating user root 45.78.219.213 port 49166 [preauth]
Dec 03 00:12:13 compute-0 podman[217411]: 2025-12-03 00:12:13.092485219 +0000 UTC m=+0.050313577 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:12:13 compute-0 nova_compute[187243]: 2025-12-03 00:12:13.355 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:15 compute-0 nova_compute[187243]: 2025-12-03 00:12:15.139 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:16 compute-0 sshd-session[217431]: Invalid user dolphinscheduler from 23.95.37.90 port 57888
Dec 03 00:12:16 compute-0 sshd-session[217431]: Received disconnect from 23.95.37.90 port 57888:11: Bye Bye [preauth]
Dec 03 00:12:16 compute-0 sshd-session[217431]: Disconnected from invalid user dolphinscheduler 23.95.37.90 port 57888 [preauth]
Dec 03 00:12:18 compute-0 nova_compute[187243]: 2025-12-03 00:12:18.358 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:20 compute-0 nova_compute[187243]: 2025-12-03 00:12:20.140 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:20 compute-0 sshd-session[217433]: Invalid user dangulo from 45.78.219.95 port 51714
Dec 03 00:12:21 compute-0 sshd-session[217433]: Received disconnect from 45.78.219.95 port 51714:11: Bye Bye [preauth]
Dec 03 00:12:21 compute-0 sshd-session[217433]: Disconnected from invalid user dangulo 45.78.219.95 port 51714 [preauth]
Dec 03 00:12:22 compute-0 nova_compute[187243]: 2025-12-03 00:12:22.080 187247 DEBUG nova.compute.manager [None req-19facadd-14bc-4891-a8f6-697ec4bb74d8 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Dec 03 00:12:22 compute-0 nova_compute[187243]: 2025-12-03 00:12:22.122 187247 DEBUG nova.compute.provider_tree [None req-19facadd-14bc-4891-a8f6-697ec4bb74d8 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 35 to 36 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:12:23 compute-0 podman[217435]: 2025-12-03 00:12:23.098647925 +0000 UTC m=+0.053105666 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:12:23 compute-0 nova_compute[187243]: 2025-12-03 00:12:23.389 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:25 compute-0 nova_compute[187243]: 2025-12-03 00:12:25.141 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:26 compute-0 podman[217459]: 2025-12-03 00:12:26.090331648 +0000 UTC m=+0.051307472 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 03 00:12:26 compute-0 podman[217460]: 2025-12-03 00:12:26.146721224 +0000 UTC m=+0.105306769 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Dec 03 00:12:27 compute-0 ovn_controller[95488]: 2025-12-03T00:12:27Z|00145|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:12:28 compute-0 nova_compute[187243]: 2025-12-03 00:12:28.391 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:28 compute-0 nova_compute[187243]: 2025-12-03 00:12:28.437 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Check if temp file /var/lib/nova/instances/tmphtz5c8zp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:12:28 compute-0 nova_compute[187243]: 2025-12-03 00:12:28.441 187247 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb1b85fe-471a-46bd-9929-c377144cb8eb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:12:29 compute-0 podman[197600]: time="2025-12-03T00:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:12:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:12:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Dec 03 00:12:30 compute-0 nova_compute[187243]: 2025-12-03 00:12:30.144 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: ERROR   00:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: ERROR   00:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: ERROR   00:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: ERROR   00:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: ERROR   00:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:12:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:12:31 compute-0 sshd-session[217501]: Invalid user valheim from 61.220.235.10 port 58966
Dec 03 00:12:32 compute-0 sshd-session[217501]: Received disconnect from 61.220.235.10 port 58966:11: Bye Bye [preauth]
Dec 03 00:12:32 compute-0 sshd-session[217501]: Disconnected from invalid user valheim 61.220.235.10 port 58966 [preauth]
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.141 187247 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.192 187247 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.192 187247 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.246 187247 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.248 187247 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Preparing to wait for external event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.248 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.249 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.249 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:33 compute-0 nova_compute[187243]: 2025-12-03 00:12:33.392 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:35 compute-0 nova_compute[187243]: 2025-12-03 00:12:35.145 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:38 compute-0 nova_compute[187243]: 2025-12-03 00:12:38.435 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:39 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:39.128 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:12:39 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:39.129 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.129 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.134 187247 DEBUG nova.compute.manager [req-d98b8c68-c40e-4142-9b93-b221edb5577e req-23588354-40d5-47c5-9f3b-69383ee38f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.135 187247 DEBUG oslo_concurrency.lockutils [req-d98b8c68-c40e-4142-9b93-b221edb5577e req-23588354-40d5-47c5-9f3b-69383ee38f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.135 187247 DEBUG oslo_concurrency.lockutils [req-d98b8c68-c40e-4142-9b93-b221edb5577e req-23588354-40d5-47c5-9f3b-69383ee38f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.135 187247 DEBUG oslo_concurrency.lockutils [req-d98b8c68-c40e-4142-9b93-b221edb5577e req-23588354-40d5-47c5-9f3b-69383ee38f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.135 187247 DEBUG nova.compute.manager [req-d98b8c68-c40e-4142-9b93-b221edb5577e req-23588354-40d5-47c5-9f3b-69383ee38f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No event matching network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 in dict_keys([('network-vif-plugged', 'f36e34c0-cc70-4a73-b904-d40c504fefa3')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:12:39 compute-0 nova_compute[187243]: 2025-12-03 00:12:39.136 187247 DEBUG nova.compute.manager [req-d98b8c68-c40e-4142-9b93-b221edb5577e req-23588354-40d5-47c5-9f3b-69383ee38f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:12:40 compute-0 nova_compute[187243]: 2025-12-03 00:12:40.147 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:40 compute-0 sshd-session[217509]: Received disconnect from 102.210.148.92 port 34526:11: Bye Bye [preauth]
Dec 03 00:12:40 compute-0 sshd-session[217509]: Disconnected from authenticating user root 102.210.148.92 port 34526 [preauth]
Dec 03 00:12:40 compute-0 nova_compute[187243]: 2025-12-03 00:12:40.770 187247 INFO nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:12:41 compute-0 podman[217512]: 2025-12-03 00:12:41.0925318 +0000 UTC m=+0.050920302 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.189 187247 DEBUG nova.compute.manager [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.189 187247 DEBUG oslo_concurrency.lockutils [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.189 187247 DEBUG oslo_concurrency.lockutils [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG oslo_concurrency.lockutils [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG nova.compute.manager [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Processing event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG nova.compute.manager [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-changed-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG nova.compute.manager [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Refreshing instance network info cache due to event network-changed-f36e34c0-cc70-4a73-b904-d40c504fefa3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG oslo_concurrency.lockutils [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG oslo_concurrency.lockutils [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.190 187247 DEBUG nova.network.neutron [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Refreshing network info cache for port f36e34c0-cc70-4a73-b904-d40c504fefa3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.192 187247 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.702 187247 WARNING neutronclient.v2_0.client [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:41 compute-0 nova_compute[187243]: 2025-12-03 00:12:41.706 187247 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtz5c8zp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='eb1b85fe-471a-46bd-9929-c377144cb8eb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(8e05e9e2-afa5-4a3a-a936-f7c5663fe52f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.067 187247 WARNING neutronclient.v2_0.client [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.219 187247 DEBUG nova.objects.instance [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid eb1b85fe-471a-46bd-9929-c377144cb8eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.220 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.221 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.221 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.662 187247 DEBUG nova.network.neutron [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updated VIF entry in instance network info cache for port f36e34c0-cc70-4a73-b904-d40c504fefa3. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.662 187247 DEBUG nova.network.neutron [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating instance_info_cache with network_info: [{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.723 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.723 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.732 187247 DEBUG nova.virt.libvirt.vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1319059932',id=20,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:11:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0nyztijh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:11:34Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.733 187247 DEBUG nova.network.os_vif_util [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.734 187247 DEBUG nova.network.os_vif_util [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.734 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:ee:f1:9d"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <target dev="tapf36e34c0-cc"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]: </interface>
Dec 03 00:12:42 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.735 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <name>instance-00000014</name>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <uuid>eb1b85fe-471a-46bd-9929-c377144cb8eb</uuid>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932</nova:name>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:11:29</nova:creationTime>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:user uuid="6048ff4ab0aa45689a23ca16a6558b9d">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin</nova:user>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:project uuid="8618acb8fd774a27ac00f4e0f10b934c">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013</nova:project>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:port uuid="f36e34c0-cc70-4a73-b904-d40c504fefa3">
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <system>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="serial">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="uuid">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </system>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <os>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </os>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <features>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </features>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:ee:f1:9d"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="tapf36e34c0-cc"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </target>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </console>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </input>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <video>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </video>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]: </domain>
Dec 03 00:12:42 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.735 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <name>instance-00000014</name>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <uuid>eb1b85fe-471a-46bd-9929-c377144cb8eb</uuid>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932</nova:name>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:11:29</nova:creationTime>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:user uuid="6048ff4ab0aa45689a23ca16a6558b9d">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin</nova:user>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:project uuid="8618acb8fd774a27ac00f4e0f10b934c">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013</nova:project>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:port uuid="f36e34c0-cc70-4a73-b904-d40c504fefa3">
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <system>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="serial">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="uuid">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </system>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <os>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </os>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <features>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </features>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:ee:f1:9d"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="tapf36e34c0-cc"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </target>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </console>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </input>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <video>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </video>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]: </domain>
Dec 03 00:12:42 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.736 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <name>instance-00000014</name>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <uuid>eb1b85fe-471a-46bd-9929-c377144cb8eb</uuid>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932</nova:name>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:11:29</nova:creationTime>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:user uuid="6048ff4ab0aa45689a23ca16a6558b9d">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin</nova:user>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:project uuid="8618acb8fd774a27ac00f4e0f10b934c">tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013</nova:project>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <nova:port uuid="f36e34c0-cc70-4a73-b904-d40c504fefa3">
Dec 03 00:12:42 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <system>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="serial">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="uuid">eb1b85fe-471a-46bd-9929-c377144cb8eb</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </system>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <os>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </os>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <features>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </features>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/disk.config"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:ee:f1:9d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf36e34c0-cc"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:12:42 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       </target>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb/console.log" append="off"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </console>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </input>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <video>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </video>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:12:42 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:12:42 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:12:42 compute-0 nova_compute[187243]: </domain>
Dec 03 00:12:42 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:12:42 compute-0 nova_compute[187243]: 2025-12-03 00:12:42.736 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:12:43 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:43.131 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:43 compute-0 nova_compute[187243]: 2025-12-03 00:12:43.167 187247 DEBUG oslo_concurrency.lockutils [req-e8fcc16a-bc3f-4031-aa0e-0d41168033af req-c62d36b9-5d20-4905-9b08-b327b2477164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-eb1b85fe-471a-46bd-9929-c377144cb8eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:12:43 compute-0 nova_compute[187243]: 2025-12-03 00:12:43.225 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:12:43 compute-0 nova_compute[187243]: 2025-12-03 00:12:43.226 187247 INFO nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:12:43 compute-0 nova_compute[187243]: 2025-12-03 00:12:43.436 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:44 compute-0 podman[217534]: 2025-12-03 00:12:44.09225221 +0000 UTC m=+0.053461755 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.249 187247 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.759 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.759 187247 DEBUG nova.virt.libvirt.migration [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:12:44 compute-0 kernel: tapf36e34c0-cc (unregistering): left promiscuous mode
Dec 03 00:12:44 compute-0 NetworkManager[55671]: <info>  [1764720764.8642] device (tapf36e34c0-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:12:44 compute-0 ovn_controller[95488]: 2025-12-03T00:12:44Z|00146|binding|INFO|Releasing lport f36e34c0-cc70-4a73-b904-d40c504fefa3 from this chassis (sb_readonly=0)
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.925 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:44 compute-0 ovn_controller[95488]: 2025-12-03T00:12:44Z|00147|binding|INFO|Setting lport f36e34c0-cc70-4a73-b904-d40c504fefa3 down in Southbound
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.926 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:44 compute-0 ovn_controller[95488]: 2025-12-03T00:12:44Z|00148|binding|INFO|Removing iface tapf36e34c0-cc ovn-installed in OVS
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.928 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:44.932 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f1:9d 10.100.0.13'], port_security=['fa:16:3e:ee:f1:9d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'eb1b85fe-471a-46bd-9929-c377144cb8eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44651134-dca8-45c2-963a-1f17aac67593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8618acb8fd774a27ac00f4e0f10b934c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1aec6d5f-c8c6-4b74-ad3d-5af55712b2e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8f4f3fe-a4b0-48ff-9b01-d63a8cee7576, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=f36e34c0-cc70-4a73-b904-d40c504fefa3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:12:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:44.933 104379 INFO neutron.agent.ovn.metadata.agent [-] Port f36e34c0-cc70-4a73-b904-d40c504fefa3 in datapath 44651134-dca8-45c2-963a-1f17aac67593 unbound from our chassis
Dec 03 00:12:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:44.934 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44651134-dca8-45c2-963a-1f17aac67593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:12:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:44.934 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfb0b39-9ba6-4ada-b2d7-a5ccf1a92f58]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:44.935 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 namespace which is not needed anymore
Dec 03 00:12:44 compute-0 nova_compute[187243]: 2025-12-03 00:12:44.938 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:44 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec 03 00:12:44 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000014.scope: Consumed 14.847s CPU time.
Dec 03 00:12:44 compute-0 systemd-machined[153518]: Machine qemu-12-instance-00000014 terminated.
Dec 03 00:12:45 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [NOTICE]   (217236) : haproxy version is 3.0.5-8e879a5
Dec 03 00:12:45 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [NOTICE]   (217236) : path to executable is /usr/sbin/haproxy
Dec 03 00:12:45 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [WARNING]  (217236) : Exiting Master process...
Dec 03 00:12:45 compute-0 podman[217590]: 2025-12-03 00:12:45.041244649 +0000 UTC m=+0.026775894 container kill 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:12:45 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [ALERT]    (217236) : Current worker (217238) exited with code 143 (Terminated)
Dec 03 00:12:45 compute-0 neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593[217232]: [WARNING]  (217236) : All workers exited. Exiting... (0)
Dec 03 00:12:45 compute-0 systemd[1]: libpod-2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99.scope: Deactivated successfully.
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.066 187247 DEBUG nova.compute.manager [req-c021e898-cefa-4fc5-ad84-cf0b58b56a6f req-90ab7dd7-24d3-42e9-be50-7dfd847b2a71 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.067 187247 DEBUG oslo_concurrency.lockutils [req-c021e898-cefa-4fc5-ad84-cf0b58b56a6f req-90ab7dd7-24d3-42e9-be50-7dfd847b2a71 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.067 187247 DEBUG oslo_concurrency.lockutils [req-c021e898-cefa-4fc5-ad84-cf0b58b56a6f req-90ab7dd7-24d3-42e9-be50-7dfd847b2a71 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.067 187247 DEBUG oslo_concurrency.lockutils [req-c021e898-cefa-4fc5-ad84-cf0b58b56a6f req-90ab7dd7-24d3-42e9-be50-7dfd847b2a71 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.067 187247 DEBUG nova.compute.manager [req-c021e898-cefa-4fc5-ad84-cf0b58b56a6f req-90ab7dd7-24d3-42e9-be50-7dfd847b2a71 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.067 187247 DEBUG nova.compute.manager [req-c021e898-cefa-4fc5-ad84-cf0b58b56a6f req-90ab7dd7-24d3-42e9-be50-7dfd847b2a71 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:12:45 compute-0 podman[217605]: 2025-12-03 00:12:45.0861095 +0000 UTC m=+0.025378219 container died 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.096 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.097 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.097 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.106 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.106 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:12:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99-userdata-shm.mount: Deactivated successfully.
Dec 03 00:12:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca6e31c1fdfd30b84a444aa4cc1d81ee451f9f44871f6000da0a6d8eb5c1e32d-merged.mount: Deactivated successfully.
Dec 03 00:12:45 compute-0 podman[217605]: 2025-12-03 00:12:45.124925201 +0000 UTC m=+0.064193920 container cleanup 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:12:45 compute-0 systemd[1]: libpod-conmon-2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99.scope: Deactivated successfully.
Dec 03 00:12:45 compute-0 podman[217608]: 2025-12-03 00:12:45.146150527 +0000 UTC m=+0.076870284 container remove 2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.148 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.161 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d65865a2-f4f9-4835-b990-815299025f79]: (4, ("Wed Dec  3 12:12:45 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 (2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99)\n2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99\nWed Dec  3 12:12:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 (2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99)\n2bd5730a7811485e74589df616c7c501c7d783d95bfcddb20d8fb6bbf9c77d99\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.163 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[44d106be-a4ae-498a-a8a2-a63b2c470b50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.163 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/44651134-dca8-45c2-963a-1f17aac67593.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.164 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d6be135e-1b53-4e22-ac7a-33b05f08d23d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.164 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44651134-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.165 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:45 compute-0 kernel: tap44651134-d0: left promiscuous mode
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.180 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.182 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.183 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[222d167c-21b5-4d24-8a18-ee698c196259]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.196 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1369a980-1b4d-478a-91db-7ce457e3b974]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.198 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[34e6e6b1-0beb-4c15-a25a-203c3b65b8cc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.212 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[05bbffbc-502b-4a22-b48e-f567320bb5de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478367, 'reachable_time': 24095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217658, 'error': None, 'target': 'ovnmeta-44651134-dca8-45c2-963a-1f17aac67593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.214 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-44651134-dca8-45c2-963a-1f17aac67593 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:12:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:12:45.214 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[bade896e-988f-4144-b636-1b745ec694e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:12:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d44651134\x2ddca8\x2d45c2\x2d963a\x2d1f17aac67593.mount: Deactivated successfully.
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.262 187247 DEBUG nova.virt.libvirt.guest [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'eb1b85fe-471a-46bd-9929-c377144cb8eb' (instance-00000014) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.262 187247 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migration operation has completed
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.263 187247 INFO nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] _post_live_migration() is started..
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.273 187247 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.274 187247 WARNING neutronclient.v2_0.client [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.904 187247 DEBUG nova.compute.manager [req-c98af6f2-bb62-4989-b7a9-1afb602db438 req-e0add953-c07c-4105-99ba-820ca14e863c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.905 187247 DEBUG oslo_concurrency.lockutils [req-c98af6f2-bb62-4989-b7a9-1afb602db438 req-e0add953-c07c-4105-99ba-820ca14e863c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.905 187247 DEBUG oslo_concurrency.lockutils [req-c98af6f2-bb62-4989-b7a9-1afb602db438 req-e0add953-c07c-4105-99ba-820ca14e863c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.905 187247 DEBUG oslo_concurrency.lockutils [req-c98af6f2-bb62-4989-b7a9-1afb602db438 req-e0add953-c07c-4105-99ba-820ca14e863c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.905 187247 DEBUG nova.compute.manager [req-c98af6f2-bb62-4989-b7a9-1afb602db438 req-e0add953-c07c-4105-99ba-820ca14e863c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:45 compute-0 nova_compute[187243]: 2025-12-03 00:12:45.905 187247 DEBUG nova.compute.manager [req-c98af6f2-bb62-4989-b7a9-1afb602db438 req-e0add953-c07c-4105-99ba-820ca14e863c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.674 187247 DEBUG nova.network.neutron [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port f36e34c0-cc70-4a73-b904-d40c504fefa3 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.674 187247 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.675 187247 DEBUG nova.virt.libvirt.vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:11:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1319059932',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1319059932',id=20,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:11:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8618acb8fd774a27ac00f4e0f10b934c',ramdisk_id='',reservation_id='r-0nyztijh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw
_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1016189013-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:12:24Z,user_data=None,user_id='6048ff4ab0aa45689a23ca16a6558b9d',uuid=eb1b85fe-471a-46bd-9929-c377144cb8eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.675 187247 DEBUG nova.network.os_vif_util [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "address": "fa:16:3e:ee:f1:9d", "network": {"id": "44651134-dca8-45c2-963a-1f17aac67593", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-82173201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05a149bd8b504e438531bb5b9409e4db", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf36e34c0-cc", "ovs_interfaceid": "f36e34c0-cc70-4a73-b904-d40c504fefa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.676 187247 DEBUG nova.network.os_vif_util [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.676 187247 DEBUG os_vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.677 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.678 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf36e34c0-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.679 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.681 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.682 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.682 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.682 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0d2552a7-ddb0-46b3-8339-64cacd3de002) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.683 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.684 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.685 187247 INFO os_vif [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f1:9d,bridge_name='br-int',has_traffic_filtering=True,id=f36e34c0-cc70-4a73-b904-d40c504fefa3,network=Network(44651134-dca8-45c2-963a-1f17aac67593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf36e34c0-cc')
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.685 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.686 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.686 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.686 187247 DEBUG nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.686 187247 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Deleting instance files /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb_del
Dec 03 00:12:46 compute-0 nova_compute[187243]: 2025-12-03 00:12:46.687 187247 INFO nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Deletion of /var/lib/nova/instances/eb1b85fe-471a-46bd-9929-c377144cb8eb_del complete
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.109 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.137 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.137 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.138 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.138 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.138 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.139 187247 WARNING nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received unexpected event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with vm_state active and task_state migrating.
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.139 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.139 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.139 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.140 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.140 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.140 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-unplugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.140 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.141 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.141 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.141 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.141 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.142 187247 WARNING nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received unexpected event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with vm_state active and task_state migrating.
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.142 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.142 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.142 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.143 187247 DEBUG oslo_concurrency.lockutils [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.143 187247 DEBUG nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] No waiting events found dispatching network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.143 187247 WARNING nova.compute.manager [req-2d772b3e-7d57-498f-8291-efdaf67bb3bb req-3f216e3b-9071-42e5-b676-c173c24c4e5e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Received unexpected event network-vif-plugged-f36e34c0-cc70-4a73-b904-d40c504fefa3 for instance with vm_state active and task_state migrating.
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.265 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.266 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.283 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.284 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5810MB free_disk=73.16228103637695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.284 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:47 compute-0 nova_compute[187243]: 2025-12-03 00:12:47.284 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.303 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Updating resource usage from migration 8e05e9e2-afa5-4a3a-a936-f7c5663fe52f
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.335 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 8e05e9e2-afa5-4a3a-a936-f7c5663fe52f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.335 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.335 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:12:47 up  1:20,  0 user,  load average: 0.22, 0.19, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_8618acb8fd774a27ac00f4e0f10b934c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.365 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.438 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:48 compute-0 nova_compute[187243]: 2025-12-03 00:12:48.873 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:12:49 compute-0 nova_compute[187243]: 2025-12-03 00:12:49.381 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:12:49 compute-0 nova_compute[187243]: 2025-12-03 00:12:49.381 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:49 compute-0 nova_compute[187243]: 2025-12-03 00:12:49.381 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:50 compute-0 sshd[128750]: Timeout before authentication for connection from 101.47.140.127 to 38.102.83.77, pid = 216915
Dec 03 00:12:50 compute-0 nova_compute[187243]: 2025-12-03 00:12:50.887 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:50 compute-0 nova_compute[187243]: 2025-12-03 00:12:50.887 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:50 compute-0 nova_compute[187243]: 2025-12-03 00:12:50.888 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:50 compute-0 nova_compute[187243]: 2025-12-03 00:12:50.888 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:51 compute-0 nova_compute[187243]: 2025-12-03 00:12:51.685 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:53 compute-0 nova_compute[187243]: 2025-12-03 00:12:53.497 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:54 compute-0 podman[217661]: 2025-12-03 00:12:54.085334854 +0000 UTC m=+0.045944029 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:12:54 compute-0 nova_compute[187243]: 2025-12-03 00:12:54.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:54 compute-0 nova_compute[187243]: 2025-12-03 00:12:54.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:12:55 compute-0 nova_compute[187243]: 2025-12-03 00:12:55.102 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:12:56 compute-0 nova_compute[187243]: 2025-12-03 00:12:56.688 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:57 compute-0 podman[217686]: 2025-12-03 00:12:57.128055409 +0000 UTC m=+0.080424531 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:12:57 compute-0 podman[217687]: 2025-12-03 00:12:57.132451048 +0000 UTC m=+0.086482742 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.220 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.220 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.220 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "eb1b85fe-471a-46bd-9929-c377144cb8eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.738 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.739 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.739 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.739 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.880 187247 WARNING nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.881 187247 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.907 187247 DEBUG oslo_concurrency.processutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.908 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.1623306274414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.908 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:12:57 compute-0 nova_compute[187243]: 2025-12-03 00:12:57.908 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:12:58 compute-0 nova_compute[187243]: 2025-12-03 00:12:58.499 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:12:58 compute-0 nova_compute[187243]: 2025-12-03 00:12:58.933 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance eb1b85fe-471a-46bd-9929-c377144cb8eb refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:12:59 compute-0 nova_compute[187243]: 2025-12-03 00:12:59.097 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:12:59 compute-0 nova_compute[187243]: 2025-12-03 00:12:59.442 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:12:59 compute-0 nova_compute[187243]: 2025-12-03 00:12:59.473 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 8e05e9e2-afa5-4a3a-a936-f7c5663fe52f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:12:59 compute-0 nova_compute[187243]: 2025-12-03 00:12:59.473 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:12:59 compute-0 nova_compute[187243]: 2025-12-03 00:12:59.473 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:12:57 up  1:21,  0 user,  load average: 0.19, 0.18, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:12:59 compute-0 nova_compute[187243]: 2025-12-03 00:12:59.528 187247 DEBUG nova.compute.provider_tree [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:12:59 compute-0 podman[197600]: time="2025-12-03T00:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:12:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:12:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:13:00 compute-0 nova_compute[187243]: 2025-12-03 00:13:00.036 187247 DEBUG nova.scheduler.client.report [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:13:00 compute-0 nova_compute[187243]: 2025-12-03 00:13:00.545 187247 DEBUG nova.compute.resource_tracker [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:13:00 compute-0 nova_compute[187243]: 2025-12-03 00:13:00.545 187247 DEBUG oslo_concurrency.lockutils [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.637s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:00 compute-0 nova_compute[187243]: 2025-12-03 00:13:00.563 187247 INFO nova.compute.manager [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:13:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:00.705 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:00.705 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:00.705 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: ERROR   00:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: ERROR   00:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: ERROR   00:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: ERROR   00:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: ERROR   00:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:13:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:13:01 compute-0 nova_compute[187243]: 2025-12-03 00:13:01.651 187247 INFO nova.scheduler.client.report [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 8e05e9e2-afa5-4a3a-a936-f7c5663fe52f
Dec 03 00:13:01 compute-0 nova_compute[187243]: 2025-12-03 00:13:01.652 187247 DEBUG nova.virt.libvirt.driver [None req-2f8805ab-ebe3-4718-a8c2-5ca62644d4b5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: eb1b85fe-471a-46bd-9929-c377144cb8eb] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:13:01 compute-0 nova_compute[187243]: 2025-12-03 00:13:01.690 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:03 compute-0 nova_compute[187243]: 2025-12-03 00:13:03.538 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:06 compute-0 nova_compute[187243]: 2025-12-03 00:13:06.693 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:08 compute-0 nova_compute[187243]: 2025-12-03 00:13:08.585 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:11 compute-0 nova_compute[187243]: 2025-12-03 00:13:11.696 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:12 compute-0 podman[217734]: 2025-12-03 00:13:12.110783319 +0000 UTC m=+0.062989371 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, 
distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:13:13 compute-0 nova_compute[187243]: 2025-12-03 00:13:13.183 187247 DEBUG nova.compute.manager [None req-147179a7-5c40-4e7a-a40d-f34eeb10f73a 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Dec 03 00:13:13 compute-0 nova_compute[187243]: 2025-12-03 00:13:13.224 187247 DEBUG nova.compute.provider_tree [None req-147179a7-5c40-4e7a-a40d-f34eeb10f73a 7ede684cab6e46758f9d1100711cfe79 22106c97f2524355a0bbadb98eaf5c22 - - default default] Updating resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 generation from 36 to 39 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 03 00:13:13 compute-0 nova_compute[187243]: 2025-12-03 00:13:13.586 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:15 compute-0 podman[217755]: 2025-12-03 00:13:15.109774292 +0000 UTC m=+0.068143279 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:13:16 compute-0 nova_compute[187243]: 2025-12-03 00:13:16.699 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:16 compute-0 sshd[128750]: drop connection #0 from [101.47.140.127]:47116 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 03 00:13:18 compute-0 nova_compute[187243]: 2025-12-03 00:13:18.507 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:18 compute-0 nova_compute[187243]: 2025-12-03 00:13:18.638 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:21 compute-0 nova_compute[187243]: 2025-12-03 00:13:21.702 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:23 compute-0 nova_compute[187243]: 2025-12-03 00:13:23.639 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:24 compute-0 nova_compute[187243]: 2025-12-03 00:13:24.939 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:25 compute-0 podman[217776]: 2025-12-03 00:13:25.110527377 +0000 UTC m=+0.059116685 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:13:26 compute-0 nova_compute[187243]: 2025-12-03 00:13:26.705 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:28 compute-0 podman[217800]: 2025-12-03 00:13:28.122810088 +0000 UTC m=+0.067719999 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:13:28 compute-0 podman[217801]: 2025-12-03 00:13:28.184876473 +0000 UTC m=+0.124100883 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:13:28 compute-0 nova_compute[187243]: 2025-12-03 00:13:28.642 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:29 compute-0 sshd-session[217843]: Received disconnect from 20.123.120.169 port 51616:11: Bye Bye [preauth]
Dec 03 00:13:29 compute-0 sshd-session[217843]: Disconnected from authenticating user root 20.123.120.169 port 51616 [preauth]
Dec 03 00:13:29 compute-0 podman[197600]: time="2025-12-03T00:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:13:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:13:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: ERROR   00:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: ERROR   00:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: ERROR   00:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: ERROR   00:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: ERROR   00:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:13:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:13:31 compute-0 nova_compute[187243]: 2025-12-03 00:13:31.707 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:32.857 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:2b:fe 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1299554f9c3e4ee7a7991ca25c47f7c1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=42f0d9e7-7c77-4247-8972-6beac3a53206) old=Port_Binding(mac=['fa:16:3e:99:2b:fe'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1299554f9c3e4ee7a7991ca25c47f7c1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:32.858 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 42f0d9e7-7c77-4247-8972-6beac3a53206 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd updated
Dec 03 00:13:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:32.858 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:13:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:32.860 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[abbe0095-3e66-40de-b07e-2a06e7359e67]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:33 compute-0 nova_compute[187243]: 2025-12-03 00:13:33.698 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:36 compute-0 sshd-session[217845]: Invalid user mysql from 49.247.36.49 port 57209
Dec 03 00:13:36 compute-0 sshd-session[217845]: Received disconnect from 49.247.36.49 port 57209:11: Bye Bye [preauth]
Dec 03 00:13:36 compute-0 sshd-session[217845]: Disconnected from invalid user mysql 49.247.36.49 port 57209 [preauth]
Dec 03 00:13:36 compute-0 nova_compute[187243]: 2025-12-03 00:13:36.709 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:38 compute-0 nova_compute[187243]: 2025-12-03 00:13:38.699 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:39 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:39.873 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:39 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:39.873 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:13:39 compute-0 nova_compute[187243]: 2025-12-03 00:13:39.875 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:40 compute-0 sshd-session[217848]: Invalid user kiosk from 23.95.37.90 port 34878
Dec 03 00:13:40 compute-0 sshd-session[217848]: Received disconnect from 23.95.37.90 port 34878:11: Bye Bye [preauth]
Dec 03 00:13:40 compute-0 sshd-session[217848]: Disconnected from invalid user kiosk 23.95.37.90 port 34878 [preauth]
Dec 03 00:13:40 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:40.874 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:13:40 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:40.970 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:35:a1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca79c0e-a98f-49bb-a5b9-e71f73a04bad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8a7bc21f-8de6-41a4-bcee-f8a6bbb9133f) old=Port_Binding(mac=['fa:16:3e:92:35:a1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ec4e0a2-2d69-48a4-b43f-5378b9156efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:13:40 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:40.971 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8a7bc21f-8de6-41a4-bcee-f8a6bbb9133f in datapath 4ec4e0a2-2d69-48a4-b43f-5378b9156efd updated
Dec 03 00:13:40 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:40.972 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ec4e0a2-2d69-48a4-b43f-5378b9156efd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:13:40 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:13:40.972 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5ebf48-5c82-427a-b5d8-f60779e602b7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:13:41 compute-0 nova_compute[187243]: 2025-12-03 00:13:41.596 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:41 compute-0 nova_compute[187243]: 2025-12-03 00:13:41.596 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:41 compute-0 nova_compute[187243]: 2025-12-03 00:13:41.743 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:43 compute-0 podman[217851]: 2025-12-03 00:13:43.098958956 +0000 UTC m=+0.056576772 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, 
release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:13:43 compute-0 nova_compute[187243]: 2025-12-03 00:13:43.700 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:44 compute-0 nova_compute[187243]: 2025-12-03 00:13:44.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:44 compute-0 nova_compute[187243]: 2025-12-03 00:13:44.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:13:46 compute-0 podman[217872]: 2025-12-03 00:13:46.106272505 +0000 UTC m=+0.061680419 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:13:46 compute-0 nova_compute[187243]: 2025-12-03 00:13:46.745 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:47 compute-0 nova_compute[187243]: 2025-12-03 00:13:47.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:48 compute-0 nova_compute[187243]: 2025-12-03 00:13:48.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:48 compute-0 nova_compute[187243]: 2025-12-03 00:13:48.702 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.103 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.230 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.231 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.248 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.249 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5820MB free_disk=73.16234970092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.249 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.249 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.499 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:49 compute-0 nova_compute[187243]: 2025-12-03 00:13:49.500 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:50 compute-0 nova_compute[187243]: 2025-12-03 00:13:50.005 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:13:50 compute-0 nova_compute[187243]: 2025-12-03 00:13:50.550 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:50 compute-0 nova_compute[187243]: 2025-12-03 00:13:50.855 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 799be56b-eb56-4319-a027-b0fe2cf7991f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Dec 03 00:13:50 compute-0 nova_compute[187243]: 2025-12-03 00:13:50.856 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:13:50 compute-0 nova_compute[187243]: 2025-12-03 00:13:50.856 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:13:49 up  1:21,  0 user,  load average: 0.08, 0.15, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:13:50 compute-0 nova_compute[187243]: 2025-12-03 00:13:50.930 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:13:51 compute-0 nova_compute[187243]: 2025-12-03 00:13:51.729 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:13:51 compute-0 nova_compute[187243]: 2025-12-03 00:13:51.747 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:52 compute-0 nova_compute[187243]: 2025-12-03 00:13:52.240 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:13:52 compute-0 nova_compute[187243]: 2025-12-03 00:13:52.240 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.991s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:52 compute-0 nova_compute[187243]: 2025-12-03 00:13:52.241 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.691s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:52 compute-0 nova_compute[187243]: 2025-12-03 00:13:52.252 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:13:52 compute-0 nova_compute[187243]: 2025-12-03 00:13:52.252 187247 INFO nova.compute.claims [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:13:52 compute-0 sshd-session[217894]: Invalid user username from 102.210.148.92 port 59094
Dec 03 00:13:53 compute-0 sshd-session[217894]: Received disconnect from 102.210.148.92 port 59094:11: Bye Bye [preauth]
Dec 03 00:13:53 compute-0 sshd-session[217894]: Disconnected from invalid user username 102.210.148.92 port 59094 [preauth]
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.136 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.137 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.137 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.138 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.307 187247 DEBUG nova.compute.provider_tree [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.652 187247 WARNING nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.653 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Triggering sync for uuid 799be56b-eb56-4319-a027-b0fe2cf7991f _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.653 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.705 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:53 compute-0 ovn_controller[95488]: 2025-12-03T00:13:53Z|00149|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:13:53 compute-0 nova_compute[187243]: 2025-12-03 00:13:53.816 187247 DEBUG nova.scheduler.client.report [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:13:53 compute-0 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 00:13:54 compute-0 nova_compute[187243]: 2025-12-03 00:13:54.325 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.084s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:54 compute-0 nova_compute[187243]: 2025-12-03 00:13:54.326 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:13:54 compute-0 nova_compute[187243]: 2025-12-03 00:13:54.838 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:13:54 compute-0 nova_compute[187243]: 2025-12-03 00:13:54.838 187247 DEBUG nova.network.neutron [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:13:54 compute-0 nova_compute[187243]: 2025-12-03 00:13:54.838 187247 WARNING neutronclient.v2_0.client [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:54 compute-0 nova_compute[187243]: 2025-12-03 00:13:54.839 187247 WARNING neutronclient.v2_0.client [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:55 compute-0 nova_compute[187243]: 2025-12-03 00:13:55.351 187247 INFO nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:13:55 compute-0 nova_compute[187243]: 2025-12-03 00:13:55.925 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:13:56 compute-0 podman[217897]: 2025-12-03 00:13:56.098668783 +0000 UTC m=+0.056702345 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.749 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.823 187247 DEBUG nova.network.neutron [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Successfully created port: de4a7aac-87a1-4237-9c69-504ca4fa7d87 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.946 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.947 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.947 187247 INFO nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Creating image(s)
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.948 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.948 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.948 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.949 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.952 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:13:56 compute-0 nova_compute[187243]: 2025-12-03 00:13:56.953 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.008 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.009 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.010 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.010 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.013 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.014 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.063 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.064 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.099 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.100 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.101 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.151 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.152 187247 DEBUG nova.virt.disk.api [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Checking if we can resize image /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.153 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.218 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.219 187247 DEBUG nova.virt.disk.api [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Cannot resize image /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.220 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.220 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Ensure instance console log exists: /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.221 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.221 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:13:57 compute-0 nova_compute[187243]: 2025-12-03 00:13:57.221 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.706 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.855 187247 DEBUG nova.network.neutron [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Successfully updated port: de4a7aac-87a1-4237-9c69-504ca4fa7d87 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.909 187247 DEBUG nova.compute.manager [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-changed-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.909 187247 DEBUG nova.compute.manager [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Refreshing instance network info cache due to event network-changed-de4a7aac-87a1-4237-9c69-504ca4fa7d87. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.910 187247 DEBUG oslo_concurrency.lockutils [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.910 187247 DEBUG oslo_concurrency.lockutils [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:13:58 compute-0 nova_compute[187243]: 2025-12-03 00:13:58.910 187247 DEBUG nova.network.neutron [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Refreshing network info cache for port de4a7aac-87a1-4237-9c69-504ca4fa7d87 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:13:59 compute-0 podman[217936]: 2025-12-03 00:13:59.10351273 +0000 UTC m=+0.055478525 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:13:59 compute-0 podman[217937]: 2025-12-03 00:13:59.189252863 +0000 UTC m=+0.141807242 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:13:59 compute-0 nova_compute[187243]: 2025-12-03 00:13:59.360 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:13:59 compute-0 nova_compute[187243]: 2025-12-03 00:13:59.416 187247 WARNING neutronclient.v2_0.client [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:13:59 compute-0 nova_compute[187243]: 2025-12-03 00:13:59.679 187247 DEBUG nova.network.neutron [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:13:59 compute-0 podman[197600]: time="2025-12-03T00:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:13:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:13:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec 03 00:13:59 compute-0 nova_compute[187243]: 2025-12-03 00:13:59.849 187247 DEBUG nova.network.neutron [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:14:00 compute-0 nova_compute[187243]: 2025-12-03 00:14:00.355 187247 DEBUG oslo_concurrency.lockutils [req-a3b14e26-e062-4abe-b1dc-553f4fdf3265 req-55d11c2b-48ca-4b64-908a-f451564885ba 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:14:00 compute-0 nova_compute[187243]: 2025-12-03 00:14:00.355 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquired lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:14:00 compute-0 nova_compute[187243]: 2025-12-03 00:14:00.356 187247 DEBUG nova.network.neutron [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:14:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:00.706 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:00.706 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:00.706 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:01 compute-0 sshd-session[217983]: Received disconnect from 61.220.235.10 port 58144:11: Bye Bye [preauth]
Dec 03 00:14:01 compute-0 sshd-session[217983]: Disconnected from authenticating user root 61.220.235.10 port 58144 [preauth]
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: ERROR   00:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: ERROR   00:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: ERROR   00:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: ERROR   00:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: ERROR   00:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:14:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:14:01 compute-0 nova_compute[187243]: 2025-12-03 00:14:01.708 187247 DEBUG nova.network.neutron [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:14:01 compute-0 nova_compute[187243]: 2025-12-03 00:14:01.750 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.045 187247 WARNING neutronclient.v2_0.client [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.214 187247 DEBUG nova.network.neutron [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:14:02 compute-0 sshd-session[217985]: Invalid user desliga from 45.78.222.160 port 46886
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.723 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Releasing lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.724 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance network_info: |[{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.729 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Start _get_guest_xml network_info=[{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.733 187247 WARNING nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.736 187247 DEBUG nova.virt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355', uuid='799be56b-eb56-4319-a027-b0fe2cf7991f'), owner=OwnerMeta(userid='0473307cd38b412cbfdbd093053eb1af', username='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin', projectid='e510a0888b4c4fb5860a0f1720b8ed4b', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1290727110'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720842.735938) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.741 187247 DEBUG nova.virt.libvirt.host [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.742 187247 DEBUG nova.virt.libvirt.host [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.746 187247 DEBUG nova.virt.libvirt.host [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.748 187247 DEBUG nova.virt.libvirt.host [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.750 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.750 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.751 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.752 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.752 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.753 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.753 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.754 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.754 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.755 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.755 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.755 187247 DEBUG nova.virt.hardware [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.762 187247 DEBUG nova.virt.libvirt.vif [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:13:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118620355',id=22,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-103ztj3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:13:55Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=799be56b-eb56-4319-a027-b0fe2cf7991f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.763 187247 DEBUG nova.network.os_vif_util [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.764 187247 DEBUG nova.network.os_vif_util [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:14:02 compute-0 nova_compute[187243]: 2025-12-03 00:14:02.765 187247 DEBUG nova.objects.instance [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'pci_devices' on Instance uuid 799be56b-eb56-4319-a027-b0fe2cf7991f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.545 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <uuid>799be56b-eb56-4319-a027-b0fe2cf7991f</uuid>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <name>instance-00000016</name>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355</nova:name>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:14:02</nova:creationTime>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:14:03 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:14:03 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         <nova:port uuid="de4a7aac-87a1-4237-9c69-504ca4fa7d87">
Dec 03 00:14:03 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <system>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <entry name="serial">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <entry name="uuid">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </system>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <os>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </os>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <features>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </features>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:da:29:37"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <target dev="tapde4a7aac-87"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <video>
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </video>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:14:03 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:14:03 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:14:03 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:14:03 compute-0 nova_compute[187243]: </domain>
Dec 03 00:14:03 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.547 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Preparing to wait for external event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.547 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.547 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.547 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.548 187247 DEBUG nova.virt.libvirt.vif [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:13:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118620355',id=22,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-103ztj3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:13:55Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=799be56b-eb56-4319-a027-b0fe2cf7991f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.548 187247 DEBUG nova.network.os_vif_util [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.549 187247 DEBUG nova.network.os_vif_util [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.549 187247 DEBUG os_vif [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.550 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.550 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.551 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.551 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.552 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd80bb76c-ed29-5ef3-a19d-c23d06690b98', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.553 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.554 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.555 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.558 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.558 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde4a7aac-87, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.559 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapde4a7aac-87, col_values=(('qos', UUID('a2f4b668-9d3a-4828-a8e2-768d9318749b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.559 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapde4a7aac-87, col_values=(('external_ids', {'iface-id': 'de4a7aac-87a1-4237-9c69-504ca4fa7d87', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:29:37', 'vm-uuid': '799be56b-eb56-4319-a027-b0fe2cf7991f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.561 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 NetworkManager[55671]: <info>  [1764720843.5618] manager: (tapde4a7aac-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.563 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.566 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.567 187247 INFO os_vif [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87')
Dec 03 00:14:03 compute-0 nova_compute[187243]: 2025-12-03 00:14:03.755 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.114 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.115 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.115 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No VIF found with MAC fa:16:3e:da:29:37, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.116 187247 INFO nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Using config drive
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.626 187247 WARNING neutronclient.v2_0.client [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.855 187247 INFO nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Creating config drive at /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.862 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmplk5h5ql2 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:05 compute-0 nova_compute[187243]: 2025-12-03 00:14:05.992 187247 DEBUG oslo_concurrency.processutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmplk5h5ql2" returned: 0 in 0.129s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:06 compute-0 kernel: tapde4a7aac-87: entered promiscuous mode
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.068 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 NetworkManager[55671]: <info>  [1764720846.0698] manager: (tapde4a7aac-87): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Dec 03 00:14:06 compute-0 ovn_controller[95488]: 2025-12-03T00:14:06Z|00150|binding|INFO|Claiming lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 for this chassis.
Dec 03 00:14:06 compute-0 ovn_controller[95488]: 2025-12-03T00:14:06Z|00151|binding|INFO|de4a7aac-87a1-4237-9c69-504ca4fa7d87: Claiming fa:16:3e:da:29:37 10.100.0.8
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.075 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.090 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:29:37 10.100.0.8'], port_security=['fa:16:3e:da:29:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '799be56b-eb56-4319-a027-b0fe2cf7991f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=de4a7aac-87a1-4237-9c69-504ca4fa7d87) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.091 104379 INFO neutron.agent.ovn.metadata.agent [-] Port de4a7aac-87a1-4237-9c69-504ca4fa7d87 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd bound to our chassis
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.092 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:14:06 compute-0 systemd-udevd[218008]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.107 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[913b0f87-bba0-489e-9b8b-cda6ac8733f7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.109 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee60e03c-a1 in ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.110 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee60e03c-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.111 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d69ed182-6333-4e04-a803-09ced59e7f61]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.111 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bf2aca38-0673-4846-93db-3a9b97244e3f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.122 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[686940e4-9fdd-40b9-881a-d7a42e67de5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 NetworkManager[55671]: <info>  [1764720846.1264] device (tapde4a7aac-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:14:06 compute-0 NetworkManager[55671]: <info>  [1764720846.1272] device (tapde4a7aac-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:14:06 compute-0 systemd-machined[153518]: New machine qemu-13-instance-00000016.
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.152 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[645da7eb-6ea3-4199-a931-599d06fb6860]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.159 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 ovn_controller[95488]: 2025-12-03T00:14:06Z|00152|binding|INFO|Setting lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 ovn-installed in OVS
Dec 03 00:14:06 compute-0 ovn_controller[95488]: 2025-12-03T00:14:06Z|00153|binding|INFO|Setting lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 up in Southbound
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.166 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000016.
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.180 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[8791b55c-b541-479d-8070-4809253312d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.184 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[06deb0fd-e8c6-4b29-95c8-569e9cc6ca1e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 systemd-udevd[218013]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:14:06 compute-0 NetworkManager[55671]: <info>  [1764720846.1860] manager: (tapee60e03c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.222 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d53370-cd1a-49fc-a208-cc4029d6f911]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.226 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[05f766d5-cb55-470c-a29f-db31e0964aaa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 NetworkManager[55671]: <info>  [1764720846.2577] device (tapee60e03c-a0): carrier: link connected
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.265 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1a834d-3b5f-4b17-ade7-d70e556e3830]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.283 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[960a8504-bf07-4446-b182-054c6cadec94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493690, 'reachable_time': 17216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218041, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.303 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[601f1416-a640-44b2-88e1-e057feb46cb5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:2bfe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493690, 'tstamp': 493690}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218042, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.327 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fbba71fa-9754-421c-80c8-71c841c1479c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493690, 'reachable_time': 17216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218043, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.367 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[418a3b6f-5947-4acd-b5a9-ac738039cc5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.459 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[83954529-92e5-4c2b-bafb-a8254a08812e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.460 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.460 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.460 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.462 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 NetworkManager[55671]: <info>  [1764720846.4632] manager: (tapee60e03c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec 03 00:14:06 compute-0 kernel: tapee60e03c-a0: entered promiscuous mode
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.466 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:06 compute-0 ovn_controller[95488]: 2025-12-03T00:14:06Z|00154|binding|INFO|Releasing lport 42f0d9e7-7c77-4247-8972-6beac3a53206 from this chassis (sb_readonly=0)
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.468 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.470 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4672770a-3759-43be-b7db-12ee25daccef]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.470 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.471 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.471 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ee60e03c-ab3a-419f-84ef-62aec4b6b0dd disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.471 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.471 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[08b67f7a-98e4-46dd-a1db-abbc4cd906a4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.472 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.472 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a38827ef-3915-4f5a-9e12-ae31223f2907]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.472 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:14:06 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:06.473 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'env', 'PROCESS_TAG=haproxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.488 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.865 187247 DEBUG nova.compute.manager [req-571eb2bb-cfb2-40ea-a713-f5a110d15f1f req-26b5fbd3-e6b0-4c84-972c-a6eaed4897e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.866 187247 DEBUG oslo_concurrency.lockutils [req-571eb2bb-cfb2-40ea-a713-f5a110d15f1f req-26b5fbd3-e6b0-4c84-972c-a6eaed4897e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.866 187247 DEBUG oslo_concurrency.lockutils [req-571eb2bb-cfb2-40ea-a713-f5a110d15f1f req-26b5fbd3-e6b0-4c84-972c-a6eaed4897e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.866 187247 DEBUG oslo_concurrency.lockutils [req-571eb2bb-cfb2-40ea-a713-f5a110d15f1f req-26b5fbd3-e6b0-4c84-972c-a6eaed4897e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.867 187247 DEBUG nova.compute.manager [req-571eb2bb-cfb2-40ea-a713-f5a110d15f1f req-26b5fbd3-e6b0-4c84-972c-a6eaed4897e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Processing event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.868 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.873 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.877 187247 INFO nova.virt.libvirt.driver [-] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance spawned successfully.
Dec 03 00:14:06 compute-0 nova_compute[187243]: 2025-12-03 00:14:06.878 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:14:06 compute-0 podman[218082]: 2025-12-03 00:14:06.851412528 +0000 UTC m=+0.030263631 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:14:07 compute-0 podman[218082]: 2025-12-03 00:14:07.157574849 +0000 UTC m=+0.336425902 container create 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:14:07 compute-0 systemd[1]: Started libpod-conmon-4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa.scope.
Dec 03 00:14:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03282525b8458b7616556fa160bc3ab7c0bd11209abd0a094cc884ef948d3672/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:14:07 compute-0 podman[218082]: 2025-12-03 00:14:07.247585418 +0000 UTC m=+0.426436501 container init 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:14:07 compute-0 podman[218082]: 2025-12-03 00:14:07.256289444 +0000 UTC m=+0.435140497 container start 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:14:07 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [NOTICE]   (218101) : New worker (218103) forked
Dec 03 00:14:07 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [NOTICE]   (218101) : Loading success.
Dec 03 00:14:07 compute-0 nova_compute[187243]: 2025-12-03 00:14:07.389 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:07 compute-0 nova_compute[187243]: 2025-12-03 00:14:07.390 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:07 compute-0 nova_compute[187243]: 2025-12-03 00:14:07.391 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:07 compute-0 nova_compute[187243]: 2025-12-03 00:14:07.391 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:07 compute-0 nova_compute[187243]: 2025-12-03 00:14:07.392 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:07 compute-0 nova_compute[187243]: 2025-12-03 00:14:07.392 187247 DEBUG nova.virt.libvirt.driver [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:08 compute-0 nova_compute[187243]: 2025-12-03 00:14:08.601 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:08 compute-0 nova_compute[187243]: 2025-12-03 00:14:08.759 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:09 compute-0 nova_compute[187243]: 2025-12-03 00:14:09.862 187247 DEBUG nova.compute.manager [req-e5a261af-e9ea-4390-a0e4-75b9be6a9313 req-75e7c93e-d0b4-4b7d-a967-abd44863e538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:14:09 compute-0 nova_compute[187243]: 2025-12-03 00:14:09.863 187247 DEBUG oslo_concurrency.lockutils [req-e5a261af-e9ea-4390-a0e4-75b9be6a9313 req-75e7c93e-d0b4-4b7d-a967-abd44863e538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:09 compute-0 nova_compute[187243]: 2025-12-03 00:14:09.863 187247 DEBUG oslo_concurrency.lockutils [req-e5a261af-e9ea-4390-a0e4-75b9be6a9313 req-75e7c93e-d0b4-4b7d-a967-abd44863e538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:09 compute-0 nova_compute[187243]: 2025-12-03 00:14:09.863 187247 DEBUG oslo_concurrency.lockutils [req-e5a261af-e9ea-4390-a0e4-75b9be6a9313 req-75e7c93e-d0b4-4b7d-a967-abd44863e538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:09 compute-0 nova_compute[187243]: 2025-12-03 00:14:09.864 187247 DEBUG nova.compute.manager [req-e5a261af-e9ea-4390-a0e4-75b9be6a9313 req-75e7c93e-d0b4-4b7d-a967-abd44863e538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:14:09 compute-0 nova_compute[187243]: 2025-12-03 00:14:09.864 187247 WARNING nova.compute.manager [req-e5a261af-e9ea-4390-a0e4-75b9be6a9313 req-75e7c93e-d0b4-4b7d-a967-abd44863e538 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received unexpected event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with vm_state building and task_state spawning.
Dec 03 00:14:10 compute-0 nova_compute[187243]: 2025-12-03 00:14:10.335 187247 INFO nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Took 13.39 seconds to spawn the instance on the hypervisor.
Dec 03 00:14:10 compute-0 nova_compute[187243]: 2025-12-03 00:14:10.337 187247 DEBUG nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:14:10 compute-0 nova_compute[187243]: 2025-12-03 00:14:10.867 187247 INFO nova.compute.manager [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Took 20.35 seconds to build instance.
Dec 03 00:14:11 compute-0 nova_compute[187243]: 2025-12-03 00:14:11.373 187247 DEBUG oslo_concurrency.lockutils [None req-401cfcf8-8581-4985-9bfc-2584014275f3 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.873s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:11 compute-0 nova_compute[187243]: 2025-12-03 00:14:11.374 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 17.721s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:11 compute-0 nova_compute[187243]: 2025-12-03 00:14:11.883 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.509s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:13 compute-0 nova_compute[187243]: 2025-12-03 00:14:13.604 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:13 compute-0 nova_compute[187243]: 2025-12-03 00:14:13.761 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:14 compute-0 podman[218112]: 2025-12-03 00:14:14.120509697 +0000 UTC m=+0.079689874 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64)
Dec 03 00:14:15 compute-0 nova_compute[187243]: 2025-12-03 00:14:15.055 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:15 compute-0 nova_compute[187243]: 2025-12-03 00:14:15.055 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:15 compute-0 nova_compute[187243]: 2025-12-03 00:14:15.562 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:14:16 compute-0 nova_compute[187243]: 2025-12-03 00:14:16.105 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:16 compute-0 nova_compute[187243]: 2025-12-03 00:14:16.105 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:16 compute-0 nova_compute[187243]: 2025-12-03 00:14:16.112 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:14:16 compute-0 nova_compute[187243]: 2025-12-03 00:14:16.112 187247 INFO nova.compute.claims [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:14:17 compute-0 podman[218134]: 2025-12-03 00:14:17.127664572 +0000 UTC m=+0.084394531 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:14:17 compute-0 nova_compute[187243]: 2025-12-03 00:14:17.183 187247 DEBUG nova.compute.provider_tree [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:14:17 compute-0 nova_compute[187243]: 2025-12-03 00:14:17.697 187247 DEBUG nova.scheduler.client.report [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.208 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.209 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.606 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.718 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.719 187247 DEBUG nova.network.neutron [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.719 187247 WARNING neutronclient.v2_0.client [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.720 187247 WARNING neutronclient.v2_0.client [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:18 compute-0 ovn_controller[95488]: 2025-12-03T00:14:18Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:da:29:37 10.100.0.8
Dec 03 00:14:18 compute-0 ovn_controller[95488]: 2025-12-03T00:14:18Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:da:29:37 10.100.0.8
Dec 03 00:14:18 compute-0 nova_compute[187243]: 2025-12-03 00:14:18.823 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.227 187247 INFO nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.253 187247 DEBUG nova.network.neutron [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Successfully created port: 062f9630-4130-4b89-ad40-cd81d67fc31b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.743 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.751 187247 DEBUG nova.network.neutron [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Successfully updated port: 062f9630-4130-4b89-ad40-cd81d67fc31b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.803 187247 DEBUG nova.compute.manager [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-changed-062f9630-4130-4b89-ad40-cd81d67fc31b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.803 187247 DEBUG nova.compute.manager [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Refreshing instance network info cache due to event network-changed-062f9630-4130-4b89-ad40-cd81d67fc31b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.804 187247 DEBUG oslo_concurrency.lockutils [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-972caa01-bec4-48a2-99f2-51a323f96e88" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.804 187247 DEBUG oslo_concurrency.lockutils [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-972caa01-bec4-48a2-99f2-51a323f96e88" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:14:19 compute-0 nova_compute[187243]: 2025-12-03 00:14:19.804 187247 DEBUG nova.network.neutron [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Refreshing network info cache for port 062f9630-4130-4b89-ad40-cd81d67fc31b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.256 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "refresh_cache-972caa01-bec4-48a2-99f2-51a323f96e88" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.308 187247 WARNING neutronclient.v2_0.client [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.762 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.763 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.764 187247 INFO nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Creating image(s)
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.765 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "/var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.765 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.766 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.767 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.773 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.775 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.789 187247 DEBUG nova.network.neutron [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.855 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.857 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.858 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.859 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.866 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.867 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.928 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.928 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:20 compute-0 nova_compute[187243]: 2025-12-03 00:14:20.940 187247 DEBUG nova.network.neutron [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:14:21 compute-0 nova_compute[187243]: 2025-12-03 00:14:21.449 187247 DEBUG oslo_concurrency.lockutils [req-18acb4c0-1e82-4f0c-b5d7-605e40561163 req-33360744-6336-42e4-841d-bfb26614cd6a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-972caa01-bec4-48a2-99f2-51a323f96e88" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:14:21 compute-0 nova_compute[187243]: 2025-12-03 00:14:21.450 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquired lock "refresh_cache-972caa01-bec4-48a2-99f2-51a323f96e88" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:14:21 compute-0 nova_compute[187243]: 2025-12-03 00:14:21.451 187247 DEBUG nova.network.neutron [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:14:21 compute-0 nova_compute[187243]: 2025-12-03 00:14:21.958 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk 1073741824" returned: 0 in 1.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:21 compute-0 nova_compute[187243]: 2025-12-03 00:14:21.959 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:21 compute-0 nova_compute[187243]: 2025-12-03 00:14:21.960 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.017 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.017 187247 DEBUG nova.virt.disk.api [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Checking if we can resize image /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.018 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.073 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.074 187247 DEBUG nova.virt.disk.api [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Cannot resize image /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.074 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.075 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Ensure instance console log exists: /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.075 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.075 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.076 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:22 compute-0 nova_compute[187243]: 2025-12-03 00:14:22.797 187247 DEBUG nova.network.neutron [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:14:23 compute-0 nova_compute[187243]: 2025-12-03 00:14:23.058 187247 WARNING neutronclient.v2_0.client [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:23 compute-0 nova_compute[187243]: 2025-12-03 00:14:23.607 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:23 compute-0 nova_compute[187243]: 2025-12-03 00:14:23.825 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:23 compute-0 nova_compute[187243]: 2025-12-03 00:14:23.894 187247 DEBUG nova.network.neutron [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Updating instance_info_cache with network_info: [{"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.405 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Releasing lock "refresh_cache-972caa01-bec4-48a2-99f2-51a323f96e88" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.405 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Instance network_info: |[{"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.408 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Start _get_guest_xml network_info=[{"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.411 187247 WARNING nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.413 187247 DEBUG nova.virt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1091845202', uuid='972caa01-bec4-48a2-99f2-51a323f96e88'), owner=OwnerMeta(userid='0473307cd38b412cbfdbd093053eb1af', username='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin', projectid='e510a0888b4c4fb5860a0f1720b8ed4b', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1290727110'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": 
"062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720864.4129066) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.463 187247 DEBUG nova.virt.libvirt.host [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.464 187247 DEBUG nova.virt.libvirt.host [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.486 187247 DEBUG nova.virt.libvirt.host [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.486 187247 DEBUG nova.virt.libvirt.host [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.487 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.488 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.488 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.488 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.489 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.489 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.489 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.489 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.490 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.490 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.490 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.490 187247 DEBUG nova.virt.hardware [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.494 187247 DEBUG nova.virt.libvirt.vif [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1091845202',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1091845202',id=23,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-gz4xevxl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-T
estExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:14:19Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=972caa01-bec4-48a2-99f2-51a323f96e88,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.494 187247 DEBUG nova.network.os_vif_util [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.495 187247 DEBUG nova.network.os_vif_util [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:14:24 compute-0 nova_compute[187243]: 2025-12-03 00:14:24.496 187247 DEBUG nova.objects.instance [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'pci_devices' on Instance uuid 972caa01-bec4-48a2-99f2-51a323f96e88 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.007 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <uuid>972caa01-bec4-48a2-99f2-51a323f96e88</uuid>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <name>instance-00000017</name>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1091845202</nova:name>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:14:24</nova:creationTime>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:14:25 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:14:25 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         <nova:port uuid="062f9630-4130-4b89-ad40-cd81d67fc31b">
Dec 03 00:14:25 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <system>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <entry name="serial">972caa01-bec4-48a2-99f2-51a323f96e88</entry>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <entry name="uuid">972caa01-bec4-48a2-99f2-51a323f96e88</entry>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </system>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <os>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </os>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <features>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </features>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.config"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:97:78:92"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <target dev="tap062f9630-41"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/console.log" append="off"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <video>
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </video>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:14:25 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:14:25 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:14:25 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:14:25 compute-0 nova_compute[187243]: </domain>
Dec 03 00:14:25 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.009 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Preparing to wait for external event network-vif-plugged-062f9630-4130-4b89-ad40-cd81d67fc31b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.009 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.009 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.010 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.010 187247 DEBUG nova.virt.libvirt.vif [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1091845202',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1091845202',id=23,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-gz4xevxl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name=
'tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:14:19Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=972caa01-bec4-48a2-99f2-51a323f96e88,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.011 187247 DEBUG nova.network.os_vif_util [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.011 187247 DEBUG nova.network.os_vif_util [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.012 187247 DEBUG os_vif [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.012 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.013 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.013 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.014 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.014 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9e69a31c-abf2-5270-8960-46734516162d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.016 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.020 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.021 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap062f9630-41, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.021 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap062f9630-41, col_values=(('qos', UUID('d8354bac-4fca-4e7f-8272-a48aa3307785')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.022 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap062f9630-41, col_values=(('external_ids', {'iface-id': '062f9630-4130-4b89-ad40-cd81d67fc31b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:78:92', 'vm-uuid': '972caa01-bec4-48a2-99f2-51a323f96e88'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.023 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:25 compute-0 NetworkManager[55671]: <info>  [1764720865.0244] manager: (tap062f9630-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.026 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.031 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:25 compute-0 nova_compute[187243]: 2025-12-03 00:14:25.032 187247 INFO os_vif [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41')
Dec 03 00:14:26 compute-0 nova_compute[187243]: 2025-12-03 00:14:26.577 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:14:26 compute-0 nova_compute[187243]: 2025-12-03 00:14:26.578 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:14:26 compute-0 nova_compute[187243]: 2025-12-03 00:14:26.578 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No VIF found with MAC fa:16:3e:97:78:92, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:14:26 compute-0 nova_compute[187243]: 2025-12-03 00:14:26.579 187247 INFO nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Using config drive
Dec 03 00:14:27 compute-0 nova_compute[187243]: 2025-12-03 00:14:27.089 187247 WARNING neutronclient.v2_0.client [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:14:27 compute-0 podman[218192]: 2025-12-03 00:14:27.099530589 +0000 UTC m=+0.053791033 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:14:27 compute-0 nova_compute[187243]: 2025-12-03 00:14:27.902 187247 INFO nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Creating config drive at /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.config
Dec 03 00:14:27 compute-0 nova_compute[187243]: 2025-12-03 00:14:27.909 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpoo0i69gn execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.033 187247 DEBUG oslo_concurrency.processutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpoo0i69gn" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:28 compute-0 kernel: tap062f9630-41: entered promiscuous mode
Dec 03 00:14:28 compute-0 NetworkManager[55671]: <info>  [1764720868.0916] manager: (tap062f9630-41): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Dec 03 00:14:28 compute-0 ovn_controller[95488]: 2025-12-03T00:14:28Z|00155|binding|INFO|Claiming lport 062f9630-4130-4b89-ad40-cd81d67fc31b for this chassis.
Dec 03 00:14:28 compute-0 ovn_controller[95488]: 2025-12-03T00:14:28Z|00156|binding|INFO|062f9630-4130-4b89-ad40-cd81d67fc31b: Claiming fa:16:3e:97:78:92 10.100.0.4
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.093 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.099 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:78:92 10.100.0.4'], port_security=['fa:16:3e:97:78:92 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '972caa01-bec4-48a2-99f2-51a323f96e88', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=062f9630-4130-4b89-ad40-cd81d67fc31b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.100 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 062f9630-4130-4b89-ad40-cd81d67fc31b in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd bound to our chassis
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.102 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:14:28 compute-0 ovn_controller[95488]: 2025-12-03T00:14:28Z|00157|binding|INFO|Setting lport 062f9630-4130-4b89-ad40-cd81d67fc31b ovn-installed in OVS
Dec 03 00:14:28 compute-0 ovn_controller[95488]: 2025-12-03T00:14:28Z|00158|binding|INFO|Setting lport 062f9630-4130-4b89-ad40-cd81d67fc31b up in Southbound
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.108 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.111 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.119 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[08f2f5c5-ea60-4f74-a3f6-704ffd42ceea]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 systemd-udevd[218231]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:14:28 compute-0 systemd-machined[153518]: New machine qemu-14-instance-00000017.
Dec 03 00:14:28 compute-0 NetworkManager[55671]: <info>  [1764720868.1376] device (tap062f9630-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:14:28 compute-0 NetworkManager[55671]: <info>  [1764720868.1386] device (tap062f9630-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:14:28 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000017.
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.151 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a43a81-0279-4562-9c12-b4bb46e880bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.154 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[09c7e49a-aff2-467d-8180-1140ca08c69b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.185 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[8e55f386-bcf8-408d-902d-780f37cf84f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.201 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[564d6127-7b8e-49d1-8f15-5b6bf96dee8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493690, 'reachable_time': 17216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218245, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.219 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0ac50f-9561-4def-bc7b-a2fad4a72895]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493704, 'tstamp': 493704}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218247, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493709, 'tstamp': 493709}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218247, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.220 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.221 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.222 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.222 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.222 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.223 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.223 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:14:28 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:14:28.224 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b51175-84e3-4f04-959a-89d8eca969fc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.826 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.965 187247 DEBUG nova.compute.manager [req-6f4982be-d33c-4d32-9d8f-ebfcd85f6096 req-e39cd9d2-9c03-47aa-91d6-ed0a6b32391b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-plugged-062f9630-4130-4b89-ad40-cd81d67fc31b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.965 187247 DEBUG oslo_concurrency.lockutils [req-6f4982be-d33c-4d32-9d8f-ebfcd85f6096 req-e39cd9d2-9c03-47aa-91d6-ed0a6b32391b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.965 187247 DEBUG oslo_concurrency.lockutils [req-6f4982be-d33c-4d32-9d8f-ebfcd85f6096 req-e39cd9d2-9c03-47aa-91d6-ed0a6b32391b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.966 187247 DEBUG oslo_concurrency.lockutils [req-6f4982be-d33c-4d32-9d8f-ebfcd85f6096 req-e39cd9d2-9c03-47aa-91d6-ed0a6b32391b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.966 187247 DEBUG nova.compute.manager [req-6f4982be-d33c-4d32-9d8f-ebfcd85f6096 req-e39cd9d2-9c03-47aa-91d6-ed0a6b32391b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Processing event network-vif-plugged-062f9630-4130-4b89-ad40-cd81d67fc31b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.967 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.971 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.975 187247 INFO nova.virt.libvirt.driver [-] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Instance spawned successfully.
Dec 03 00:14:28 compute-0 nova_compute[187243]: 2025-12-03 00:14:28.975 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:14:29 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.487 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:29 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.487 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:29 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.488 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:29 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.488 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:29 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.489 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:29 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.489 187247 DEBUG nova.virt.libvirt.driver [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:14:29 compute-0 podman[197600]: time="2025-12-03T00:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:14:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:14:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Dec 03 00:14:30 compute-0 nova_compute[187243]: 2025-12-03 00:14:29.999 187247 INFO nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Took 9.24 seconds to spawn the instance on the hypervisor.
Dec 03 00:14:30 compute-0 nova_compute[187243]: 2025-12-03 00:14:30.000 187247 DEBUG nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:14:30 compute-0 nova_compute[187243]: 2025-12-03 00:14:30.024 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:30 compute-0 podman[218255]: 2025-12-03 00:14:30.103501776 +0000 UTC m=+0.057006793 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:14:30 compute-0 podman[218256]: 2025-12-03 00:14:30.137088217 +0000 UTC m=+0.090957633 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.4)
Dec 03 00:14:30 compute-0 nova_compute[187243]: 2025-12-03 00:14:30.533 187247 INFO nova.compute.manager [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Took 14.46 seconds to build instance.
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.019 187247 DEBUG nova.compute.manager [req-78b03146-bad2-4981-a25f-b25fd76f3220 req-76ea7898-03a9-4dfa-9710-f20210429c70 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-plugged-062f9630-4130-4b89-ad40-cd81d67fc31b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.019 187247 DEBUG oslo_concurrency.lockutils [req-78b03146-bad2-4981-a25f-b25fd76f3220 req-76ea7898-03a9-4dfa-9710-f20210429c70 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.019 187247 DEBUG oslo_concurrency.lockutils [req-78b03146-bad2-4981-a25f-b25fd76f3220 req-76ea7898-03a9-4dfa-9710-f20210429c70 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.020 187247 DEBUG oslo_concurrency.lockutils [req-78b03146-bad2-4981-a25f-b25fd76f3220 req-76ea7898-03a9-4dfa-9710-f20210429c70 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.020 187247 DEBUG nova.compute.manager [req-78b03146-bad2-4981-a25f-b25fd76f3220 req-76ea7898-03a9-4dfa-9710-f20210429c70 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] No waiting events found dispatching network-vif-plugged-062f9630-4130-4b89-ad40-cd81d67fc31b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.020 187247 WARNING nova.compute.manager [req-78b03146-bad2-4981-a25f-b25fd76f3220 req-76ea7898-03a9-4dfa-9710-f20210429c70 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received unexpected event network-vif-plugged-062f9630-4130-4b89-ad40-cd81d67fc31b for instance with vm_state active and task_state None.
Dec 03 00:14:31 compute-0 nova_compute[187243]: 2025-12-03 00:14:31.038 187247 DEBUG oslo_concurrency.lockutils [None req-07ec2623-6fe6-4e3a-a675-8c9c84b9de8e 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.982s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: ERROR   00:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: ERROR   00:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: ERROR   00:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: ERROR   00:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: ERROR   00:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:14:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:14:33 compute-0 nova_compute[187243]: 2025-12-03 00:14:33.859 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:35 compute-0 nova_compute[187243]: 2025-12-03 00:14:35.067 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:38 compute-0 nova_compute[187243]: 2025-12-03 00:14:38.860 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:40 compute-0 nova_compute[187243]: 2025-12-03 00:14:40.070 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:40 compute-0 ovn_controller[95488]: 2025-12-03T00:14:40Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:78:92 10.100.0.4
Dec 03 00:14:40 compute-0 ovn_controller[95488]: 2025-12-03T00:14:40Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:78:92 10.100.0.4
Dec 03 00:14:43 compute-0 nova_compute[187243]: 2025-12-03 00:14:43.108 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:43 compute-0 nova_compute[187243]: 2025-12-03 00:14:43.108 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:43 compute-0 nova_compute[187243]: 2025-12-03 00:14:43.899 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:45 compute-0 nova_compute[187243]: 2025-12-03 00:14:45.072 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:45 compute-0 podman[218309]: 2025-12-03 00:14:45.125170229 +0000 UTC m=+0.086699628 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:14:45 compute-0 nova_compute[187243]: 2025-12-03 00:14:45.940 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Check if temp file /var/lib/nova/instances/tmp763x_1ui exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:14:45 compute-0 nova_compute[187243]: 2025-12-03 00:14:45.944 187247 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='799be56b-eb56-4319-a027-b0fe2cf7991f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:14:46 compute-0 nova_compute[187243]: 2025-12-03 00:14:46.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:46 compute-0 nova_compute[187243]: 2025-12-03 00:14:46.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:14:47 compute-0 nova_compute[187243]: 2025-12-03 00:14:47.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:48 compute-0 podman[218331]: 2025-12-03 00:14:48.133481453 +0000 UTC m=+0.072565628 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:14:48 compute-0 nova_compute[187243]: 2025-12-03 00:14:48.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:48 compute-0 nova_compute[187243]: 2025-12-03 00:14:48.902 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:49 compute-0 nova_compute[187243]: 2025-12-03 00:14:49.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:49 compute-0 nova_compute[187243]: 2025-12-03 00:14:49.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:49 compute-0 nova_compute[187243]: 2025-12-03 00:14:49.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:49 compute-0 nova_compute[187243]: 2025-12-03 00:14:49.106 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.125 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.151 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.203 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.204 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.254 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.259 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.310 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.311 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.363 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.498 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.500 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.516 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.517 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5494MB free_disk=73.1049690246582GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.517 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:50 compute-0 nova_compute[187243]: 2025-12-03 00:14:50.517 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.537 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating resource usage from migration 9bbb892b-e56f-461e-a797-4ab09a78db13
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.573 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 972caa01-bec4-48a2-99f2-51a323f96e88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.574 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 9bbb892b-e56f-461e-a797-4ab09a78db13 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.574 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.575 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:14:50 up  1:23,  0 user,  load average: 0.64, 0.30, 0.30\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '1', 'num_os_type_None': '2', 'num_proj_e510a0888b4c4fb5860a0f1720b8ed4b': '2', 'io_workload': '0', 'num_task_None': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.849 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.916 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.983 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:51 compute-0 nova_compute[187243]: 2025-12-03 00:14:51.985 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.078 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.081 187247 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Preparing to wait for external event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.081 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.082 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.083 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.358 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.870 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:14:52 compute-0 nova_compute[187243]: 2025-12-03 00:14:52.871 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.353s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:53 compute-0 nova_compute[187243]: 2025-12-03 00:14:53.911 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:54 compute-0 nova_compute[187243]: 2025-12-03 00:14:54.870 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:54 compute-0 nova_compute[187243]: 2025-12-03 00:14:54.871 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:54 compute-0 nova_compute[187243]: 2025-12-03 00:14:54.872 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:14:55 compute-0 nova_compute[187243]: 2025-12-03 00:14:55.126 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:55 compute-0 sshd-session[218371]: Received disconnect from 20.123.120.169 port 51496:11: Bye Bye [preauth]
Dec 03 00:14:55 compute-0 sshd-session[218371]: Disconnected from authenticating user root 20.123.120.169 port 51496 [preauth]
Dec 03 00:14:58 compute-0 podman[218373]: 2025-12-03 00:14:58.095245327 +0000 UTC m=+0.052716216 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:14:58 compute-0 ovn_controller[95488]: 2025-12-03T00:14:58Z|00159|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.125 187247 DEBUG nova.compute.manager [req-5f11c247-a4cc-429f-9c2d-9505a83650d5 req-a32d5125-f6f6-4dfa-a8a5-4b88f6773f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.125 187247 DEBUG oslo_concurrency.lockutils [req-5f11c247-a4cc-429f-9c2d-9505a83650d5 req-a32d5125-f6f6-4dfa-a8a5-4b88f6773f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.126 187247 DEBUG oslo_concurrency.lockutils [req-5f11c247-a4cc-429f-9c2d-9505a83650d5 req-a32d5125-f6f6-4dfa-a8a5-4b88f6773f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.126 187247 DEBUG oslo_concurrency.lockutils [req-5f11c247-a4cc-429f-9c2d-9505a83650d5 req-a32d5125-f6f6-4dfa-a8a5-4b88f6773f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.126 187247 DEBUG nova.compute.manager [req-5f11c247-a4cc-429f-9c2d-9505a83650d5 req-a32d5125-f6f6-4dfa-a8a5-4b88f6773f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No event matching network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 in dict_keys([('network-vif-plugged', 'de4a7aac-87a1-4237-9c69-504ca4fa7d87')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.126 187247 DEBUG nova.compute.manager [req-5f11c247-a4cc-429f-9c2d-9505a83650d5 req-a32d5125-f6f6-4dfa-a8a5-4b88f6773f2e 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:14:58 compute-0 nova_compute[187243]: 2025-12-03 00:14:58.918 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:14:59 compute-0 nova_compute[187243]: 2025-12-03 00:14:59.119 187247 INFO nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Took 7.04 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:14:59 compute-0 podman[197600]: time="2025-12-03T00:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:14:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:14:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.130 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.225 187247 DEBUG nova.compute.manager [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.226 187247 DEBUG oslo_concurrency.lockutils [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.226 187247 DEBUG oslo_concurrency.lockutils [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.226 187247 DEBUG oslo_concurrency.lockutils [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.227 187247 DEBUG nova.compute.manager [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Processing event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.227 187247 DEBUG nova.compute.manager [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-changed-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.227 187247 DEBUG nova.compute.manager [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Refreshing instance network info cache due to event network-changed-de4a7aac-87a1-4237-9c69-504ca4fa7d87. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.227 187247 DEBUG oslo_concurrency.lockutils [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.227 187247 DEBUG oslo_concurrency.lockutils [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.228 187247 DEBUG nova.network.neutron [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Refreshing network info cache for port de4a7aac-87a1-4237-9c69-504ca4fa7d87 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.229 187247 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:15:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:00.710 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:00.710 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:00.711 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.735 187247 WARNING neutronclient.v2_0.client [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:00 compute-0 nova_compute[187243]: 2025-12-03 00:15:00.738 187247 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp763x_1ui',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='799be56b-eb56-4319-a027-b0fe2cf7991f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9bbb892b-e56f-461e-a797-4ab09a78db13),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:15:01 compute-0 podman[218399]: 2025-12-03 00:15:01.093466121 +0000 UTC m=+0.049494077 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 00:15:01 compute-0 podman[218400]: 2025-12-03 00:15:01.120411538 +0000 UTC m=+0.073311136 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.140 187247 WARNING neutronclient.v2_0.client [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.256 187247 DEBUG nova.objects.instance [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 799be56b-eb56-4319-a027-b0fe2cf7991f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.258 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.261 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.261 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: ERROR   00:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: ERROR   00:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: ERROR   00:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: ERROR   00:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: ERROR   00:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:15:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.763 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.763 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.770 187247 DEBUG nova.virt.libvirt.vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:13:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118620355',id=22,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:14:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-103ztj3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:14:10Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=799be56b-eb56-4319-a027-b0fe2cf7991f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.771 187247 DEBUG nova.network.os_vif_util [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.771 187247 DEBUG nova.network.os_vif_util [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.772 187247 DEBUG nova.virt.libvirt.migration [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:da:29:37"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <target dev="tapde4a7aac-87"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]: </interface>
Dec 03 00:15:01 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.772 187247 DEBUG nova.virt.libvirt.migration [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <name>instance-00000016</name>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <uuid>799be56b-eb56-4319-a027-b0fe2cf7991f</uuid>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355</nova:name>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:14:02</nova:creationTime>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:port uuid="de4a7aac-87a1-4237-9c69-504ca4fa7d87">
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <system>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="serial">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="uuid">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </system>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <os>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </os>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <features>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </features>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:da:29:37"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="tapde4a7aac-87"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </target>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </console>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </input>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <video>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </video>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]: </domain>
Dec 03 00:15:01 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.773 187247 DEBUG nova.virt.libvirt.migration [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <name>instance-00000016</name>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <uuid>799be56b-eb56-4319-a027-b0fe2cf7991f</uuid>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355</nova:name>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:14:02</nova:creationTime>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:port uuid="de4a7aac-87a1-4237-9c69-504ca4fa7d87">
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <system>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="serial">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="uuid">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </system>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <os>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </os>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <features>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </features>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:da:29:37"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="tapde4a7aac-87"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </target>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </console>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </input>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <video>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </video>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]: </domain>
Dec 03 00:15:01 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.774 187247 DEBUG nova.virt.libvirt.migration [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <name>instance-00000016</name>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <uuid>799be56b-eb56-4319-a027-b0fe2cf7991f</uuid>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355</nova:name>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:14:02</nova:creationTime>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <nova:port uuid="de4a7aac-87a1-4237-9c69-504ca4fa7d87">
Dec 03 00:15:01 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <system>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="serial">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="uuid">799be56b-eb56-4319-a027-b0fe2cf7991f</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </system>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <os>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </os>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <features>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </features>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/disk.config"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:da:29:37"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target dev="tapde4a7aac-87"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:15:01 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       </target>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f/console.log" append="off"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </console>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </input>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <video>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </video>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:15:01 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:15:01 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:15:01 compute-0 nova_compute[187243]: </domain>
Dec 03 00:15:01 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.774 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.804 187247 DEBUG nova.network.neutron [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updated VIF entry in instance network info cache for port de4a7aac-87a1-4237-9c69-504ca4fa7d87. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:15:01 compute-0 nova_compute[187243]: 2025-12-03 00:15:01.804 187247 DEBUG nova.network.neutron [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Updating instance_info_cache with network_info: [{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:02 compute-0 nova_compute[187243]: 2025-12-03 00:15:02.266 187247 DEBUG nova.virt.libvirt.migration [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:15:02 compute-0 nova_compute[187243]: 2025-12-03 00:15:02.266 187247 INFO nova.virt.libvirt.migration [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:15:02 compute-0 nova_compute[187243]: 2025-12-03 00:15:02.459 187247 DEBUG oslo_concurrency.lockutils [req-65611612-7bb1-4395-b9e3-fec3fc27fba5 req-d8397483-d9c2-484f-8f3e-a5494dad939f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-799be56b-eb56-4319-a027-b0fe2cf7991f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:15:03 compute-0 sshd-session[218448]: Received disconnect from 23.95.37.90 port 34922:11: Bye Bye [preauth]
Dec 03 00:15:03 compute-0 sshd-session[218448]: Disconnected from authenticating user root 23.95.37.90 port 34922 [preauth]
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.283 187247 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:15:03 compute-0 kernel: tapde4a7aac-87 (unregistering): left promiscuous mode
Dec 03 00:15:03 compute-0 NetworkManager[55671]: <info>  [1764720903.3905] device (tapde4a7aac-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.396 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:03 compute-0 ovn_controller[95488]: 2025-12-03T00:15:03Z|00160|binding|INFO|Releasing lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 from this chassis (sb_readonly=0)
Dec 03 00:15:03 compute-0 ovn_controller[95488]: 2025-12-03T00:15:03Z|00161|binding|INFO|Setting lport de4a7aac-87a1-4237-9c69-504ca4fa7d87 down in Southbound
Dec 03 00:15:03 compute-0 ovn_controller[95488]: 2025-12-03T00:15:03Z|00162|binding|INFO|Removing iface tapde4a7aac-87 ovn-installed in OVS
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.399 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.405 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:29:37 10.100.0.8'], port_security=['fa:16:3e:da:29:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '799be56b-eb56-4319-a027-b0fe2cf7991f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=de4a7aac-87a1-4237-9c69-504ca4fa7d87) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.407 104379 INFO neutron.agent.ovn.metadata.agent [-] Port de4a7aac-87a1-4237-9c69-504ca4fa7d87 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.408 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.409 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.426 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[dc605a03-b3a9-49d0-bc8e-5fa2a9630fc9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 03 00:15:03 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000016.scope: Consumed 14.581s CPU time.
Dec 03 00:15:03 compute-0 systemd-machined[153518]: Machine qemu-13-instance-00000016 terminated.
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.455 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[df8e9b14-9ea5-47c9-a3d4-a5348a6cb74f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.458 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8ed496-f564-4a09-8d6f-5e90141c89f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.487 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[489b47cd-9962-4561-8932-755c58fbb7a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.503 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf9336e-4244-433e-9bc5-76b8de5a7e23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493690, 'reachable_time': 17216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218473, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.516 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a64ed2b6-e8d4-4b77-8caa-3b63e33cd0fb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493704, 'tstamp': 493704}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218474, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493709, 'tstamp': 493709}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218474, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.517 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.518 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.522 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.523 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.523 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.523 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.523 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:15:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:03.524 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f46ff24a-4590-4791-a031-e51a8982882c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.527 187247 DEBUG nova.compute.manager [req-ad1e3de9-a1d6-42ee-8d6d-451b95eddefb req-2f0f4725-8875-497f-990b-d4e50cecdfee 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.527 187247 DEBUG oslo_concurrency.lockutils [req-ad1e3de9-a1d6-42ee-8d6d-451b95eddefb req-2f0f4725-8875-497f-990b-d4e50cecdfee 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.528 187247 DEBUG oslo_concurrency.lockutils [req-ad1e3de9-a1d6-42ee-8d6d-451b95eddefb req-2f0f4725-8875-497f-990b-d4e50cecdfee 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.528 187247 DEBUG oslo_concurrency.lockutils [req-ad1e3de9-a1d6-42ee-8d6d-451b95eddefb req-2f0f4725-8875-497f-990b-d4e50cecdfee 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.528 187247 DEBUG nova.compute.manager [req-ad1e3de9-a1d6-42ee-8d6d-451b95eddefb req-2f0f4725-8875-497f-990b-d4e50cecdfee 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.528 187247 DEBUG nova.compute.manager [req-ad1e3de9-a1d6-42ee-8d6d-451b95eddefb req-2f0f4725-8875-497f-990b-d4e50cecdfee 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.627 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.628 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.628 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.785 187247 DEBUG nova.virt.libvirt.guest [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '799be56b-eb56-4319-a027-b0fe2cf7991f' (instance-00000016) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.786 187247 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migration operation has completed
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.786 187247 INFO nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] _post_live_migration() is started..
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.801 187247 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.802 187247 WARNING neutronclient.v2_0.client [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:03 compute-0 nova_compute[187243]: 2025-12-03 00:15:03.921 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 sshd-session[218457]: Invalid user csgoserver from 49.247.36.49 port 4424
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.130 187247 DEBUG nova.compute.manager [req-111f6cce-50ee-4b2a-82de-ef3eb87c598e req-55a83383-acee-4a57-90e3-b5b4362b551d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.130 187247 DEBUG oslo_concurrency.lockutils [req-111f6cce-50ee-4b2a-82de-ef3eb87c598e req-55a83383-acee-4a57-90e3-b5b4362b551d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.131 187247 DEBUG oslo_concurrency.lockutils [req-111f6cce-50ee-4b2a-82de-ef3eb87c598e req-55a83383-acee-4a57-90e3-b5b4362b551d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.131 187247 DEBUG oslo_concurrency.lockutils [req-111f6cce-50ee-4b2a-82de-ef3eb87c598e req-55a83383-acee-4a57-90e3-b5b4362b551d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.131 187247 DEBUG nova.compute.manager [req-111f6cce-50ee-4b2a-82de-ef3eb87c598e req-55a83383-acee-4a57-90e3-b5b4362b551d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.131 187247 DEBUG nova.compute.manager [req-111f6cce-50ee-4b2a-82de-ef3eb87c598e req-55a83383-acee-4a57-90e3-b5b4362b551d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:04.149 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:04.150 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.150 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.234 187247 DEBUG nova.network.neutron [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port de4a7aac-87a1-4237-9c69-504ca4fa7d87 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.234 187247 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.235 187247 DEBUG nova.virt.libvirt.vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:13:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118620355',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118620355',id=22,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:14:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-103ztj3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:14:41Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=799be56b-eb56-4319-a027-b0fe2cf7991f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.235 187247 DEBUG nova.network.os_vif_util [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "address": "fa:16:3e:da:29:37", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde4a7aac-87", "ovs_interfaceid": "de4a7aac-87a1-4237-9c69-504ca4fa7d87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.236 187247 DEBUG nova.network.os_vif_util [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.236 187247 DEBUG os_vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.238 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.238 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde4a7aac-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.240 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.242 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.243 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.243 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a2f4b668-9d3a-4828-a8e2-768d9318749b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.244 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.245 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.247 187247 INFO os_vif [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:29:37,bridge_name='br-int',has_traffic_filtering=True,id=de4a7aac-87a1-4237-9c69-504ca4fa7d87,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde4a7aac-87')
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.247 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.248 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.248 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.248 187247 DEBUG nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.249 187247 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Deleting instance files /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f_del
Dec 03 00:15:04 compute-0 nova_compute[187243]: 2025-12-03 00:15:04.249 187247 INFO nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Deletion of /var/lib/nova/instances/799be56b-eb56-4319-a027-b0fe2cf7991f_del complete
Dec 03 00:15:04 compute-0 sshd-session[218457]: Received disconnect from 49.247.36.49 port 4424:11: Bye Bye [preauth]
Dec 03 00:15:04 compute-0 sshd-session[218457]: Disconnected from invalid user csgoserver 49.247.36.49 port 4424 [preauth]
Dec 03 00:15:04 compute-0 sshd-session[218459]: Invalid user mika from 102.210.148.92 port 48730
Dec 03 00:15:04 compute-0 sshd-session[218459]: Received disconnect from 102.210.148.92 port 48730:11: Bye Bye [preauth]
Dec 03 00:15:04 compute-0 sshd-session[218459]: Disconnected from invalid user mika 102.210.148.92 port 48730 [preauth]
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.585 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.586 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.586 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.587 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.587 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.587 187247 WARNING nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received unexpected event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with vm_state active and task_state migrating.
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.587 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.588 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.588 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.588 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.588 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.589 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-unplugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.589 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.589 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.589 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.589 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.590 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.590 187247 WARNING nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received unexpected event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with vm_state active and task_state migrating.
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.590 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.590 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.591 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.591 187247 DEBUG oslo_concurrency.lockutils [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.591 187247 DEBUG nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] No waiting events found dispatching network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:05 compute-0 nova_compute[187243]: 2025-12-03 00:15:05.591 187247 WARNING nova.compute.manager [req-2184b763-1cca-4078-872e-b6edc1189954 req-a1e25603-9361-490e-8c62-63d49cd51794 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Received unexpected event network-vif-plugged-de4a7aac-87a1-4237-9c69-504ca4fa7d87 for instance with vm_state active and task_state migrating.
Dec 03 00:15:05 compute-0 sshd-session[218397]: Connection closed by 45.78.219.213 port 47388 [preauth]
Dec 03 00:15:08 compute-0 nova_compute[187243]: 2025-12-03 00:15:08.923 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:09 compute-0 nova_compute[187243]: 2025-12-03 00:15:09.244 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:13 compute-0 sshd-session[218495]: Connection closed by 45.78.219.95 port 48380 [preauth]
Dec 03 00:15:13 compute-0 nova_compute[187243]: 2025-12-03 00:15:13.801 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:13 compute-0 nova_compute[187243]: 2025-12-03 00:15:13.801 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:13 compute-0 nova_compute[187243]: 2025-12-03 00:15:13.802 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "799be56b-eb56-4319-a027-b0fe2cf7991f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:13 compute-0 nova_compute[187243]: 2025-12-03 00:15:13.926 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:14 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:14.153 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:14 compute-0 nova_compute[187243]: 2025-12-03 00:15:14.246 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:14 compute-0 nova_compute[187243]: 2025-12-03 00:15:14.320 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:14 compute-0 nova_compute[187243]: 2025-12-03 00:15:14.321 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:14 compute-0 nova_compute[187243]: 2025-12-03 00:15:14.322 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:14 compute-0 nova_compute[187243]: 2025-12-03 00:15:14.322 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.363 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.416 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.417 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.469 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.603 187247 WARNING nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.605 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.622 187247 DEBUG oslo_concurrency.processutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.623 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5663MB free_disk=73.13370132446289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.623 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:15 compute-0 nova_compute[187243]: 2025-12-03 00:15:15.624 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:16 compute-0 podman[218505]: 2025-12-03 00:15:16.112598893 +0000 UTC m=+0.061612066 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.593 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.593 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.593 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.594 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.594 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.610 187247 INFO nova.compute.manager [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Terminating instance
Dec 03 00:15:16 compute-0 nova_compute[187243]: 2025-12-03 00:15:16.643 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 799be56b-eb56-4319-a027-b0fe2cf7991f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.127 187247 DEBUG nova.compute.manager [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.150 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:15:17 compute-0 kernel: tap062f9630-41 (unregistering): left promiscuous mode
Dec 03 00:15:17 compute-0 NetworkManager[55671]: <info>  [1764720917.1930] device (tap062f9630-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:15:17 compute-0 ovn_controller[95488]: 2025-12-03T00:15:17Z|00163|binding|INFO|Releasing lport 062f9630-4130-4b89-ad40-cd81d67fc31b from this chassis (sb_readonly=0)
Dec 03 00:15:17 compute-0 ovn_controller[95488]: 2025-12-03T00:15:17Z|00164|binding|INFO|Setting lport 062f9630-4130-4b89-ad40-cd81d67fc31b down in Southbound
Dec 03 00:15:17 compute-0 ovn_controller[95488]: 2025-12-03T00:15:17Z|00165|binding|INFO|Removing iface tap062f9630-41 ovn-installed in OVS
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.198 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.202 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Instance 972caa01-bec4-48a2-99f2-51a323f96e88 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.202 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 9bbb892b-e56f-461e-a797-4ab09a78db13 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.202 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.203 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:15:15 up  1:23,  0 user,  load average: 0.42, 0.28, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e510a0888b4c4fb5860a0f1720b8ed4b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.205 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:78:92 10.100.0.4'], port_security=['fa:16:3e:97:78:92 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '972caa01-bec4-48a2-99f2-51a323f96e88', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=062f9630-4130-4b89-ad40-cd81d67fc31b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.207 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 062f9630-4130-4b89-ad40-cd81d67fc31b in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.208 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.209 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[23208bc8-54b6-4af8-8cad-2e01374b9679]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.209 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace which is not needed anymore
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.236 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec 03 00:15:17 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000017.scope: Consumed 13.156s CPU time.
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.270 187247 DEBUG nova.compute.provider_tree [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:15:17 compute-0 systemd-machined[153518]: Machine qemu-14-instance-00000017 terminated.
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.346 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.350 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [NOTICE]   (218101) : haproxy version is 3.0.5-8e879a5
Dec 03 00:15:17 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [NOTICE]   (218101) : path to executable is /usr/sbin/haproxy
Dec 03 00:15:17 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [WARNING]  (218101) : Exiting Master process...
Dec 03 00:15:17 compute-0 podman[218551]: 2025-12-03 00:15:17.3684078 +0000 UTC m=+0.036577797 container kill 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 03 00:15:17 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [ALERT]    (218101) : Current worker (218103) exited with code 143 (Terminated)
Dec 03 00:15:17 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218097]: [WARNING]  (218101) : All workers exited. Exiting... (0)
Dec 03 00:15:17 compute-0 systemd[1]: libpod-4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa.scope: Deactivated successfully.
Dec 03 00:15:17 compute-0 conmon[218097]: conmon 4338702f513347b31271 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa.scope/container/memory.events
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.389 187247 INFO nova.virt.libvirt.driver [-] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Instance destroyed successfully.
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.390 187247 DEBUG nova.objects.instance [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'resources' on Instance uuid 972caa01-bec4-48a2-99f2-51a323f96e88 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:15:17 compute-0 podman[218579]: 2025-12-03 00:15:17.405936969 +0000 UTC m=+0.021759100 container died 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:15:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-03282525b8458b7616556fa160bc3ab7c0bd11209abd0a094cc884ef948d3672-merged.mount: Deactivated successfully.
Dec 03 00:15:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa-userdata-shm.mount: Deactivated successfully.
Dec 03 00:15:17 compute-0 podman[218579]: 2025-12-03 00:15:17.437272945 +0000 UTC m=+0.053095066 container cleanup 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:15:17 compute-0 systemd[1]: libpod-conmon-4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa.scope: Deactivated successfully.
Dec 03 00:15:17 compute-0 podman[218583]: 2025-12-03 00:15:17.456964253 +0000 UTC m=+0.059536406 container remove 4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.463 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e7668a-fbf6-4400-8893-6c3cc5085a71]: (4, ("Wed Dec  3 12:15:17 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa)\n4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa\nWed Dec  3 12:15:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa)\n4338702f513347b312713ccec5b3aa18bd657daeb622b986a0f61b8a30dbd7fa\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.465 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[287657a6-945b-4507-abf4-26efc791fad1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.465 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.466 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[47068949-6af8-49a2-b01d-76c49cacefcc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.466 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:17 compute-0 kernel: tapee60e03c-a0: left promiscuous mode
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.518 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.534 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.537 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9e4852-2eeb-4f4e-93cb-b2482a283ce3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.551 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a9b089-5e6a-42ce-99ad-34d35953c569]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.552 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[43c2c244-f733-4c7a-afe8-b31a4e6a9c54]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.568 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e0537057-ac50-4dab-83dc-16495e2f8424]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493681, 'reachable_time': 43841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218618, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.570 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:15:17 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:17.571 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[2dca3286-3ad0-4ebf-8792-9ade88afa748]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:17 compute-0 systemd[1]: run-netns-ovnmeta\x2dee60e03c\x2dab3a\x2d419f\x2d84ef\x2d62aec4b6b0dd.mount: Deactivated successfully.
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.711 187247 DEBUG nova.compute.manager [req-f2da9523-6a3e-4eb2-836d-c74c1a1d55e2 req-555a98d1-d21f-4fb3-9db5-bc70eadb0ba4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-unplugged-062f9630-4130-4b89-ad40-cd81d67fc31b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.712 187247 DEBUG oslo_concurrency.lockutils [req-f2da9523-6a3e-4eb2-836d-c74c1a1d55e2 req-555a98d1-d21f-4fb3-9db5-bc70eadb0ba4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.712 187247 DEBUG oslo_concurrency.lockutils [req-f2da9523-6a3e-4eb2-836d-c74c1a1d55e2 req-555a98d1-d21f-4fb3-9db5-bc70eadb0ba4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.713 187247 DEBUG oslo_concurrency.lockutils [req-f2da9523-6a3e-4eb2-836d-c74c1a1d55e2 req-555a98d1-d21f-4fb3-9db5-bc70eadb0ba4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.713 187247 DEBUG nova.compute.manager [req-f2da9523-6a3e-4eb2-836d-c74c1a1d55e2 req-555a98d1-d21f-4fb3-9db5-bc70eadb0ba4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] No waiting events found dispatching network-vif-unplugged-062f9630-4130-4b89-ad40-cd81d67fc31b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.713 187247 DEBUG nova.compute.manager [req-f2da9523-6a3e-4eb2-836d-c74c1a1d55e2 req-555a98d1-d21f-4fb3-9db5-bc70eadb0ba4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-unplugged-062f9630-4130-4b89-ad40-cd81d67fc31b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.781 187247 DEBUG nova.scheduler.client.report [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.897 187247 DEBUG nova.virt.libvirt.vif [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:14:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1091845202',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1091845202',id=23,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:14:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-gz4xevxl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:14:30Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=972caa01-bec4-48a2-99f2-51a323f96e88,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.898 187247 DEBUG nova.network.os_vif_util [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "062f9630-4130-4b89-ad40-cd81d67fc31b", "address": "fa:16:3e:97:78:92", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062f9630-41", "ovs_interfaceid": "062f9630-4130-4b89-ad40-cd81d67fc31b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.899 187247 DEBUG nova.network.os_vif_util [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.900 187247 DEBUG os_vif [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.903 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.904 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap062f9630-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.905 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.909 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.910 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.910 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d8354bac-4fca-4e7f-8272-a48aa3307785) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.912 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.914 187247 INFO os_vif [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:78:92,bridge_name='br-int',has_traffic_filtering=True,id=062f9630-4130-4b89-ad40-cd81d67fc31b,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062f9630-41')
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.915 187247 INFO nova.virt.libvirt.driver [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Deleting instance files /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88_del
Dec 03 00:15:17 compute-0 nova_compute[187243]: 2025-12-03 00:15:17.915 187247 INFO nova.virt.libvirt.driver [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Deletion of /var/lib/nova/instances/972caa01-bec4-48a2-99f2-51a323f96e88_del complete
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.294 187247 DEBUG nova.compute.resource_tracker [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.294 187247 DEBUG oslo_concurrency.lockutils [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.671s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.325 187247 INFO nova.compute.manager [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.427 187247 INFO nova.compute.manager [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Took 1.30 seconds to destroy the instance on the hypervisor.
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.427 187247 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.428 187247 DEBUG nova.compute.manager [-] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.428 187247 DEBUG nova.network.neutron [-] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.428 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.804 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:18 compute-0 nova_compute[187243]: 2025-12-03 00:15:18.928 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:19 compute-0 podman[218619]: 2025-12-03 00:15:19.100901941 +0000 UTC m=+0.057900516 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.144 187247 DEBUG nova.compute.manager [req-456a0acd-8430-4215-8825-26a399420673 req-805c2e86-7503-4e39-ad4a-7471e47f0926 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-deleted-062f9630-4130-4b89-ad40-cd81d67fc31b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.145 187247 INFO nova.compute.manager [req-456a0acd-8430-4215-8825-26a399420673 req-805c2e86-7503-4e39-ad4a-7471e47f0926 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Neutron deleted interface 062f9630-4130-4b89-ad40-cd81d67fc31b; detaching it from the instance and deleting it from the info cache
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.145 187247 DEBUG nova.network.neutron [req-456a0acd-8430-4215-8825-26a399420673 req-805c2e86-7503-4e39-ad4a-7471e47f0926 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.408 187247 INFO nova.scheduler.client.report [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 9bbb892b-e56f-461e-a797-4ab09a78db13
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.409 187247 DEBUG nova.virt.libvirt.driver [None req-2b3f8f49-dead-4eaf-9b2c-8857a95fa066 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 799be56b-eb56-4319-a027-b0fe2cf7991f] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.583 187247 DEBUG nova.network.neutron [-] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.652 187247 DEBUG nova.compute.manager [req-456a0acd-8430-4215-8825-26a399420673 req-805c2e86-7503-4e39-ad4a-7471e47f0926 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Detach interface failed, port_id=062f9630-4130-4b89-ad40-cd81d67fc31b, reason: Instance 972caa01-bec4-48a2-99f2-51a323f96e88 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.765 187247 DEBUG nova.compute.manager [req-31cfbe53-b68c-445a-b03d-1a28b59b0593 req-5547f97c-9c07-4fb7-8dd7-5d3c7ece9da6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-unplugged-062f9630-4130-4b89-ad40-cd81d67fc31b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.765 187247 DEBUG oslo_concurrency.lockutils [req-31cfbe53-b68c-445a-b03d-1a28b59b0593 req-5547f97c-9c07-4fb7-8dd7-5d3c7ece9da6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.766 187247 DEBUG oslo_concurrency.lockutils [req-31cfbe53-b68c-445a-b03d-1a28b59b0593 req-5547f97c-9c07-4fb7-8dd7-5d3c7ece9da6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.766 187247 DEBUG oslo_concurrency.lockutils [req-31cfbe53-b68c-445a-b03d-1a28b59b0593 req-5547f97c-9c07-4fb7-8dd7-5d3c7ece9da6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.766 187247 DEBUG nova.compute.manager [req-31cfbe53-b68c-445a-b03d-1a28b59b0593 req-5547f97c-9c07-4fb7-8dd7-5d3c7ece9da6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] No waiting events found dispatching network-vif-unplugged-062f9630-4130-4b89-ad40-cd81d67fc31b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:19 compute-0 nova_compute[187243]: 2025-12-03 00:15:19.766 187247 DEBUG nova.compute.manager [req-31cfbe53-b68c-445a-b03d-1a28b59b0593 req-5547f97c-9c07-4fb7-8dd7-5d3c7ece9da6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Received event network-vif-unplugged-062f9630-4130-4b89-ad40-cd81d67fc31b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:15:20 compute-0 nova_compute[187243]: 2025-12-03 00:15:20.090 187247 INFO nova.compute.manager [-] [instance: 972caa01-bec4-48a2-99f2-51a323f96e88] Took 1.66 seconds to deallocate network for instance.
Dec 03 00:15:20 compute-0 nova_compute[187243]: 2025-12-03 00:15:20.620 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:20 compute-0 nova_compute[187243]: 2025-12-03 00:15:20.620 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:20 compute-0 nova_compute[187243]: 2025-12-03 00:15:20.654 187247 DEBUG nova.compute.provider_tree [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:15:21 compute-0 nova_compute[187243]: 2025-12-03 00:15:21.164 187247 DEBUG nova.scheduler.client.report [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:15:21 compute-0 nova_compute[187243]: 2025-12-03 00:15:21.678 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.058s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:21 compute-0 nova_compute[187243]: 2025-12-03 00:15:21.724 187247 INFO nova.scheduler.client.report [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Deleted allocations for instance 972caa01-bec4-48a2-99f2-51a323f96e88
Dec 03 00:15:22 compute-0 nova_compute[187243]: 2025-12-03 00:15:22.749 187247 DEBUG oslo_concurrency.lockutils [None req-a2ba8cc7-b56a-4dbf-a244-a091622dc381 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "972caa01-bec4-48a2-99f2-51a323f96e88" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.156s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:22 compute-0 nova_compute[187243]: 2025-12-03 00:15:22.911 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:23 compute-0 nova_compute[187243]: 2025-12-03 00:15:23.930 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:27 compute-0 sshd-session[218639]: Received disconnect from 61.220.235.10 port 57324:11: Bye Bye [preauth]
Dec 03 00:15:27 compute-0 sshd-session[218639]: Disconnected from authenticating user root 61.220.235.10 port 57324 [preauth]
Dec 03 00:15:27 compute-0 nova_compute[187243]: 2025-12-03 00:15:27.912 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:28 compute-0 nova_compute[187243]: 2025-12-03 00:15:28.932 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:29 compute-0 podman[218641]: 2025-12-03 00:15:29.096667943 +0000 UTC m=+0.052559353 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:15:29 compute-0 podman[197600]: time="2025-12-03T00:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:15:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:15:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: ERROR   00:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: ERROR   00:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: ERROR   00:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: ERROR   00:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: ERROR   00:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:15:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:15:32 compute-0 podman[218667]: 2025-12-03 00:15:32.094543038 +0000 UTC m=+0.051783143 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:15:32 compute-0 podman[218668]: 2025-12-03 00:15:32.184972188 +0000 UTC m=+0.134989484 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:15:32 compute-0 nova_compute[187243]: 2025-12-03 00:15:32.914 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:33 compute-0 nova_compute[187243]: 2025-12-03 00:15:33.257 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:33 compute-0 nova_compute[187243]: 2025-12-03 00:15:33.258 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:33 compute-0 nova_compute[187243]: 2025-12-03 00:15:33.764 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:15:33 compute-0 nova_compute[187243]: 2025-12-03 00:15:33.935 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:34 compute-0 nova_compute[187243]: 2025-12-03 00:15:34.820 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:34 compute-0 nova_compute[187243]: 2025-12-03 00:15:34.821 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:34 compute-0 nova_compute[187243]: 2025-12-03 00:15:34.828 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:15:34 compute-0 nova_compute[187243]: 2025-12-03 00:15:34.828 187247 INFO nova.compute.claims [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:15:35 compute-0 nova_compute[187243]: 2025-12-03 00:15:35.880 187247 DEBUG nova.compute.provider_tree [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:15:36 compute-0 nova_compute[187243]: 2025-12-03 00:15:36.386 187247 DEBUG nova.scheduler.client.report [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:15:36 compute-0 nova_compute[187243]: 2025-12-03 00:15:36.896 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:36 compute-0 nova_compute[187243]: 2025-12-03 00:15:36.897 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.406 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.406 187247 DEBUG nova.network.neutron [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.406 187247 WARNING neutronclient.v2_0.client [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.407 187247 WARNING neutronclient.v2_0.client [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.841 187247 DEBUG nova.network.neutron [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Successfully created port: 75f7bf8b-141c-44e2-be3c-1fdae9af1077 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.913 187247 INFO nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:15:37 compute-0 nova_compute[187243]: 2025-12-03 00:15:37.915 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:38 compute-0 nova_compute[187243]: 2025-12-03 00:15:38.420 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:15:38 compute-0 nova_compute[187243]: 2025-12-03 00:15:38.936 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.434 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.435 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.436 187247 INFO nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Creating image(s)
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.436 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.437 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.437 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.438 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.441 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.442 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.494 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.496 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.496 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.496 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.500 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.500 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.551 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.552 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.585 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.586 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.586 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.637 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.638 187247 DEBUG nova.virt.disk.api [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Checking if we can resize image /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.639 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.691 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.691 187247 DEBUG nova.virt.disk.api [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Cannot resize image /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.692 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.692 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Ensure instance console log exists: /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.692 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.693 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.693 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.909 187247 DEBUG nova.network.neutron [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Successfully updated port: 75f7bf8b-141c-44e2-be3c-1fdae9af1077 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.961 187247 DEBUG nova.compute.manager [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-changed-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.962 187247 DEBUG nova.compute.manager [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Refreshing instance network info cache due to event network-changed-75f7bf8b-141c-44e2-be3c-1fdae9af1077. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.962 187247 DEBUG oslo_concurrency.lockutils [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.962 187247 DEBUG oslo_concurrency.lockutils [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:15:39 compute-0 nova_compute[187243]: 2025-12-03 00:15:39.962 187247 DEBUG nova.network.neutron [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Refreshing network info cache for port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:15:40 compute-0 nova_compute[187243]: 2025-12-03 00:15:40.416 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:15:40 compute-0 nova_compute[187243]: 2025-12-03 00:15:40.467 187247 WARNING neutronclient.v2_0.client [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:40 compute-0 nova_compute[187243]: 2025-12-03 00:15:40.823 187247 DEBUG nova.network.neutron [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:15:40 compute-0 nova_compute[187243]: 2025-12-03 00:15:40.989 187247 DEBUG nova.network.neutron [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:41 compute-0 nova_compute[187243]: 2025-12-03 00:15:41.495 187247 DEBUG oslo_concurrency.lockutils [req-2543c8f9-0000-49d2-9b0d-f40ab07208d3 req-6715684e-e0e8-4aae-a971-0c455950c4bb 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:15:41 compute-0 nova_compute[187243]: 2025-12-03 00:15:41.496 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquired lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:15:41 compute-0 nova_compute[187243]: 2025-12-03 00:15:41.497 187247 DEBUG nova.network.neutron [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:15:42 compute-0 nova_compute[187243]: 2025-12-03 00:15:42.818 187247 DEBUG nova.network.neutron [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:15:42 compute-0 nova_compute[187243]: 2025-12-03 00:15:42.916 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:42 compute-0 nova_compute[187243]: 2025-12-03 00:15:42.998 187247 WARNING neutronclient.v2_0.client [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:43 compute-0 sshd-session[218728]: Invalid user dangulo from 101.47.140.127 port 54588
Dec 03 00:15:43 compute-0 sshd-session[218728]: Received disconnect from 101.47.140.127 port 54588:11: Bye Bye [preauth]
Dec 03 00:15:43 compute-0 sshd-session[218728]: Disconnected from invalid user dangulo 101.47.140.127 port 54588 [preauth]
Dec 03 00:15:43 compute-0 nova_compute[187243]: 2025-12-03 00:15:43.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:43 compute-0 nova_compute[187243]: 2025-12-03 00:15:43.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:43 compute-0 nova_compute[187243]: 2025-12-03 00:15:43.824 187247 DEBUG nova.network.neutron [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:15:43 compute-0 nova_compute[187243]: 2025-12-03 00:15:43.938 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.332 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Releasing lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.332 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance network_info: |[{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.334 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Start _get_guest_xml network_info=[{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.337 187247 WARNING nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.338 187247 DEBUG nova.virt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060', uuid='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8'), owner=OwnerMeta(userid='0473307cd38b412cbfdbd093053eb1af', username='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin', projectid='e510a0888b4c4fb5860a0f1720b8ed4b', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1290727110'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='tempest-watcher_flavor-1545796181', flavorid='961ca853-f9ec-479e-bfb6-9bdd23ae3e33', memory_mb=1151, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={}, swap=0), network_info=[{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": 
"75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720944.3388178) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.343 187247 DEBUG nova.virt.libvirt.host [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.343 187247 DEBUG nova.virt.libvirt.host [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.346 187247 DEBUG nova.virt.libvirt.host [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.346 187247 DEBUG nova.virt.libvirt.host [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.347 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.348 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T00:15:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='961ca853-f9ec-479e-bfb6-9bdd23ae3e33',id=3,is_public=True,memory_mb=1151,name='tempest-watcher_flavor-1545796181',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.348 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.348 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.348 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.349 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.349 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.349 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.349 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.350 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.350 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.350 187247 DEBUG nova.virt.hardware [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.353 187247 DEBUG nova.virt.libvirt.vif [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:15:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-984503060',id=24,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-4tk0mv8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-Te
stExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:15:38Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.354 187247 DEBUG nova.network.os_vif_util [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.354 187247 DEBUG nova.network.os_vif_util [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.355 187247 DEBUG nova.objects.instance [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'pci_devices' on Instance uuid c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.863 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <uuid>c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</uuid>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <name>instance-00000018</name>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <memory>1178624</memory>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-984503060</nova:name>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:15:44</nova:creationTime>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:flavor name="tempest-watcher_flavor-1545796181" id="961ca853-f9ec-479e-bfb6-9bdd23ae3e33">
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:memory>1151</nova:memory>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:extraSpecs/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:15:44 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         <nova:port uuid="75f7bf8b-141c-44e2-be3c-1fdae9af1077">
Dec 03 00:15:44 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <system>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <entry name="serial">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <entry name="uuid">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </system>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <os>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </os>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <features>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </features>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:e6:86:86"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <target dev="tap75f7bf8b-14"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <video>
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </video>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:15:44 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:15:44 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:15:44 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:15:44 compute-0 nova_compute[187243]: </domain>
Dec 03 00:15:44 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.866 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Preparing to wait for external event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.867 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.867 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.868 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.869 187247 DEBUG nova.virt.libvirt.vif [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:15:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-984503060',id=24,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-4tk0mv8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='
tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:15:38Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.869 187247 DEBUG nova.network.os_vif_util [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.870 187247 DEBUG nova.network.os_vif_util [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.871 187247 DEBUG os_vif [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.872 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.872 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.873 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.874 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.875 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '31f8fe63-1f27-59b5-b890-f896a59a7ed5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.919 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.921 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.925 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.925 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75f7bf8b-14, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.926 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap75f7bf8b-14, col_values=(('qos', UUID('f2ae0b30-19b3-4498-aa0e-b567e7fb6f77')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.926 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap75f7bf8b-14, col_values=(('external_ids', {'iface-id': '75f7bf8b-141c-44e2-be3c-1fdae9af1077', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:86:86', 'vm-uuid': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.928 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 NetworkManager[55671]: <info>  [1764720944.9297] manager: (tap75f7bf8b-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.931 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.935 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:44 compute-0 nova_compute[187243]: 2025-12-03 00:15:44.936 187247 INFO os_vif [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14')
Dec 03 00:15:46 compute-0 nova_compute[187243]: 2025-12-03 00:15:46.717 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:15:46 compute-0 nova_compute[187243]: 2025-12-03 00:15:46.718 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:15:46 compute-0 nova_compute[187243]: 2025-12-03 00:15:46.718 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No VIF found with MAC fa:16:3e:e6:86:86, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:15:46 compute-0 nova_compute[187243]: 2025-12-03 00:15:46.719 187247 INFO nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Using config drive
Dec 03 00:15:47 compute-0 podman[218734]: 2025-12-03 00:15:47.098540893 +0000 UTC m=+0.055079745 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.229 187247 WARNING neutronclient.v2_0.client [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.405 187247 INFO nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Creating config drive at /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.411 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6oczsvfj execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.535 187247 DEBUG oslo_concurrency.processutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp6oczsvfj" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:47 compute-0 kernel: tap75f7bf8b-14: entered promiscuous mode
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:15:47 compute-0 NetworkManager[55671]: <info>  [1764720947.5941] manager: (tap75f7bf8b-14): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Dec 03 00:15:47 compute-0 ovn_controller[95488]: 2025-12-03T00:15:47Z|00166|binding|INFO|Claiming lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 for this chassis.
Dec 03 00:15:47 compute-0 ovn_controller[95488]: 2025-12-03T00:15:47Z|00167|binding|INFO|75f7bf8b-141c-44e2-be3c-1fdae9af1077: Claiming fa:16:3e:e6:86:86 10.100.0.3
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.594 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.602 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:86:86 10.100.0.3'], port_security=['fa:16:3e:e6:86:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=75f7bf8b-141c-44e2-be3c-1fdae9af1077) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.603 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd bound to our chassis
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.605 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:47 compute-0 ovn_controller[95488]: 2025-12-03T00:15:47Z|00168|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 ovn-installed in OVS
Dec 03 00:15:47 compute-0 ovn_controller[95488]: 2025-12-03T00:15:47Z|00169|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 up in Southbound
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.616 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.618 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[34170ea0-5052-4337-8c6f-646527e9d331]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.620 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.619 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee60e03c-a1 in ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.622 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee60e03c-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:15:47 compute-0 systemd-udevd[218771]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.622 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2c5e0c36-0477-49cb-a2cf-14ec1657ff08]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.623 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[19e5a225-d0e4-4bb6-b20e-7d3c5d69e586]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.634 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[65460f47-2a19-4d53-8b95-26578be5a30f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 NetworkManager[55671]: <info>  [1764720947.6382] device (tap75f7bf8b-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:15:47 compute-0 NetworkManager[55671]: <info>  [1764720947.6401] device (tap75f7bf8b-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.640 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1cac6502-01e9-4c33-8b60-d3cee59bf86b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 systemd-machined[153518]: New machine qemu-15-instance-00000018.
Dec 03 00:15:47 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000018.
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.672 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[407722a0-b768-4f66-bc14-d720bf30405f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.677 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[798b31d4-07c1-4bcf-b1a1-88310e7bcd3c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 systemd-udevd[218778]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:15:47 compute-0 NetworkManager[55671]: <info>  [1764720947.6796] manager: (tapee60e03c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.709 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[214ad8a4-e35b-4d63-a547-225a0324c6e7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.711 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[b201a37c-0a98-4b15-83e3-635b3f697405]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 NetworkManager[55671]: <info>  [1764720947.7290] device (tapee60e03c-a0): carrier: link connected
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.733 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[9d30d29c-8f8e-4c1f-bce5-8f75b5e43c1a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.746 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8e776608-072a-4cd9-9484-89f4439f8396]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503837, 'reachable_time': 40641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218806, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.756 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[54683847-9da9-4d01-a573-77c587a17897]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:2bfe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503837, 'tstamp': 503837}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218807, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.759 187247 DEBUG nova.compute.manager [req-025f6133-6356-4aad-8c60-59199b47e461 req-725bcba8-5daf-4b28-b1cf-697336558656 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.759 187247 DEBUG oslo_concurrency.lockutils [req-025f6133-6356-4aad-8c60-59199b47e461 req-725bcba8-5daf-4b28-b1cf-697336558656 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.759 187247 DEBUG oslo_concurrency.lockutils [req-025f6133-6356-4aad-8c60-59199b47e461 req-725bcba8-5daf-4b28-b1cf-697336558656 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.760 187247 DEBUG oslo_concurrency.lockutils [req-025f6133-6356-4aad-8c60-59199b47e461 req-725bcba8-5daf-4b28-b1cf-697336558656 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.760 187247 DEBUG nova.compute.manager [req-025f6133-6356-4aad-8c60-59199b47e461 req-725bcba8-5daf-4b28-b1cf-697336558656 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Processing event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.772 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4c6281d8-a9e4-458b-aa26-ae5cd034e175]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503837, 'reachable_time': 40641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218809, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.794 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[159cac54-82ea-4614-b69b-1b9c55fca052]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.871 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66d392a6-8bb5-454b-a617-c7f01166dc81]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.872 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.873 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.873 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:47 compute-0 kernel: tapee60e03c-a0: entered promiscuous mode
Dec 03 00:15:47 compute-0 NetworkManager[55671]: <info>  [1764720947.8751] manager: (tapee60e03c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.874 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.877 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:15:47 compute-0 ovn_controller[95488]: 2025-12-03T00:15:47Z|00170|binding|INFO|Releasing lport 42f0d9e7-7c77-4247-8972-6beac3a53206 from this chassis (sb_readonly=0)
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.879 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.888 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8895e485-7d0b-45f0-bd66-0868bcae940a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.889 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.889 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.890 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ee60e03c-ab3a-419f-84ef-62aec4b6b0dd disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.890 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.890 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e0bcbd4b-13b9-4748-b47f-7257662e224b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.891 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:15:47 compute-0 nova_compute[187243]: 2025-12-03 00:15:47.891 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.891 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4f11d3f1-dd7c-4fc8-ae3d-c14d415b0d2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.892 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:15:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:15:47.893 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'env', 'PROCESS_TAG=haproxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.213 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.218 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.225 187247 INFO nova.virt.libvirt.driver [-] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance spawned successfully.
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.225 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:15:48 compute-0 podman[218849]: 2025-12-03 00:15:48.342070566 +0000 UTC m=+0.067786350 container create cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:15:48 compute-0 systemd[1]: Started libpod-conmon-cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137.scope.
Dec 03 00:15:48 compute-0 podman[218849]: 2025-12-03 00:15:48.302484195 +0000 UTC m=+0.028200079 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:15:48 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:15:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bca642642759c127988d41f80f182411e5121a925e3d2446f94f2dc42754092/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:15:48 compute-0 podman[218849]: 2025-12-03 00:15:48.479427487 +0000 UTC m=+0.205143291 container init cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 00:15:48 compute-0 podman[218849]: 2025-12-03 00:15:48.487707822 +0000 UTC m=+0.213423616 container start cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:15:48 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [NOTICE]   (218869) : New worker (218871) forked
Dec 03 00:15:48 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [NOTICE]   (218869) : Loading success.
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.742 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.742 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.743 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.744 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.745 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.745 187247 DEBUG nova.virt.libvirt.driver [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:15:48 compute-0 nova_compute[187243]: 2025-12-03 00:15:48.939 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.107 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:15:49 compute-0 podman[218881]: 2025-12-03 00:15:49.229577682 +0000 UTC m=+0.070912497 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.258 187247 INFO nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Took 9.82 seconds to spawn the instance on the hypervisor.
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.259 187247 DEBUG nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.788 187247 INFO nova.compute.manager [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Took 15.01 seconds to build instance.
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.844 187247 DEBUG nova.compute.manager [req-1fb18da9-e191-48ab-a133-45d6979f6864 req-931f2213-4c7b-4fe9-921f-19a3a60c2927 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.844 187247 DEBUG oslo_concurrency.lockutils [req-1fb18da9-e191-48ab-a133-45d6979f6864 req-931f2213-4c7b-4fe9-921f-19a3a60c2927 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.845 187247 DEBUG oslo_concurrency.lockutils [req-1fb18da9-e191-48ab-a133-45d6979f6864 req-931f2213-4c7b-4fe9-921f-19a3a60c2927 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.845 187247 DEBUG oslo_concurrency.lockutils [req-1fb18da9-e191-48ab-a133-45d6979f6864 req-931f2213-4c7b-4fe9-921f-19a3a60c2927 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.845 187247 DEBUG nova.compute.manager [req-1fb18da9-e191-48ab-a133-45d6979f6864 req-931f2213-4c7b-4fe9-921f-19a3a60c2927 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.846 187247 WARNING nova.compute.manager [req-1fb18da9-e191-48ab-a133-45d6979f6864 req-931f2213-4c7b-4fe9-921f-19a3a60c2927 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received unexpected event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with vm_state active and task_state None.
Dec 03 00:15:49 compute-0 nova_compute[187243]: 2025-12-03 00:15:49.929 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.158 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.216 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.217 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.293 187247 DEBUG oslo_concurrency.lockutils [None req-f6f67001-1c8d-4871-972e-35424c0b5fc0 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.036s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.299 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.422 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.425 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.445 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.446 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.16145324707031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.446 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:50 compute-0 nova_compute[187243]: 2025-12-03 00:15:50.446 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:51 compute-0 nova_compute[187243]: 2025-12-03 00:15:51.517 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:15:51 compute-0 nova_compute[187243]: 2025-12-03 00:15:51.517 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:15:51 compute-0 nova_compute[187243]: 2025-12-03 00:15:51.518 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:15:50 up  1:24,  0 user,  load average: 0.39, 0.28, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e510a0888b4c4fb5860a0f1720b8ed4b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:15:51 compute-0 nova_compute[187243]: 2025-12-03 00:15:51.562 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:15:52 compute-0 nova_compute[187243]: 2025-12-03 00:15:52.069 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:15:52 compute-0 nova_compute[187243]: 2025-12-03 00:15:52.577 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:15:52 compute-0 nova_compute[187243]: 2025-12-03 00:15:52.578 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:53 compute-0 nova_compute[187243]: 2025-12-03 00:15:53.941 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:54 compute-0 nova_compute[187243]: 2025-12-03 00:15:54.929 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:54 compute-0 nova_compute[187243]: 2025-12-03 00:15:54.930 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:54 compute-0 nova_compute[187243]: 2025-12-03 00:15:54.974 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:55 compute-0 nova_compute[187243]: 2025-12-03 00:15:55.434 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:15:55 compute-0 nova_compute[187243]: 2025-12-03 00:15:55.992 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:15:55 compute-0 nova_compute[187243]: 2025-12-03 00:15:55.993 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:15:55 compute-0 nova_compute[187243]: 2025-12-03 00:15:55.999 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:15:55 compute-0 nova_compute[187243]: 2025-12-03 00:15:55.999 187247 INFO nova.compute.claims [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:15:56 compute-0 nova_compute[187243]: 2025-12-03 00:15:56.574 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:56 compute-0 nova_compute[187243]: 2025-12-03 00:15:56.574 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:56 compute-0 nova_compute[187243]: 2025-12-03 00:15:56.575 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:15:57 compute-0 nova_compute[187243]: 2025-12-03 00:15:57.078 187247 DEBUG nova.compute.provider_tree [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:15:58 compute-0 nova_compute[187243]: 2025-12-03 00:15:58.110 187247 DEBUG nova.scheduler.client.report [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:15:58 compute-0 nova_compute[187243]: 2025-12-03 00:15:58.620 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.627s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:15:58 compute-0 nova_compute[187243]: 2025-12-03 00:15:58.621 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:15:58 compute-0 nova_compute[187243]: 2025-12-03 00:15:58.942 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:59 compute-0 nova_compute[187243]: 2025-12-03 00:15:59.131 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:15:59 compute-0 nova_compute[187243]: 2025-12-03 00:15:59.132 187247 DEBUG nova.network.neutron [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:15:59 compute-0 nova_compute[187243]: 2025-12-03 00:15:59.133 187247 WARNING neutronclient.v2_0.client [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:59 compute-0 nova_compute[187243]: 2025-12-03 00:15:59.133 187247 WARNING neutronclient.v2_0.client [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:15:59 compute-0 nova_compute[187243]: 2025-12-03 00:15:59.639 187247 INFO nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:15:59 compute-0 podman[197600]: time="2025-12-03T00:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:15:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:15:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3064 "" "Go-http-client/1.1"
Dec 03 00:15:59 compute-0 nova_compute[187243]: 2025-12-03 00:15:59.978 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:15:59 compute-0 ovn_controller[95488]: 2025-12-03T00:15:59Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:86:86 10.100.0.3
Dec 03 00:15:59 compute-0 ovn_controller[95488]: 2025-12-03T00:15:59Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:86:86 10.100.0.3
Dec 03 00:16:00 compute-0 podman[218931]: 2025-12-03 00:16:00.096289909 +0000 UTC m=+0.052201894 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:16:00 compute-0 nova_compute[187243]: 2025-12-03 00:16:00.148 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:16:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:00.712 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:00.713 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:00.713 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:00 compute-0 nova_compute[187243]: 2025-12-03 00:16:00.982 187247 DEBUG nova.network.neutron [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Successfully created port: 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.167 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.168 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.169 187247 INFO nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Creating image(s)
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.169 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "/var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.169 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.170 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "/var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.170 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.173 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.175 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.226 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.227 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.228 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.228 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.232 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.232 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.283 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.284 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.316 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.317 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.317 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.367 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.368 187247 DEBUG nova.virt.disk.api [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Checking if we can resize image /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.368 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:01 compute-0 openstack_network_exporter[199746]: ERROR   00:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:16:01 compute-0 openstack_network_exporter[199746]: ERROR   00:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:01 compute-0 openstack_network_exporter[199746]: ERROR   00:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:01 compute-0 openstack_network_exporter[199746]: ERROR   00:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:16:01 compute-0 openstack_network_exporter[199746]: ERROR   00:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.420 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.421 187247 DEBUG nova.virt.disk.api [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Cannot resize image /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.421 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.422 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Ensure instance console log exists: /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.422 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.422 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.423 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.528 187247 DEBUG nova.network.neutron [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Successfully updated port: 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.612 187247 DEBUG nova.compute.manager [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-changed-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.613 187247 DEBUG nova.compute.manager [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Refreshing instance network info cache due to event network-changed-7c86d125-a1e0-4d28-bccb-7ba5562c31e9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.613 187247 DEBUG oslo_concurrency.lockutils [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-72f66e11-43aa-4598-95a3-697bee26b5e1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.613 187247 DEBUG oslo_concurrency.lockutils [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-72f66e11-43aa-4598-95a3-697bee26b5e1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:16:01 compute-0 nova_compute[187243]: 2025-12-03 00:16:01.613 187247 DEBUG nova.network.neutron [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Refreshing network info cache for port 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:16:02 compute-0 nova_compute[187243]: 2025-12-03 00:16:02.034 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "refresh_cache-72f66e11-43aa-4598-95a3-697bee26b5e1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:16:02 compute-0 nova_compute[187243]: 2025-12-03 00:16:02.118 187247 WARNING neutronclient.v2_0.client [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:02 compute-0 nova_compute[187243]: 2025-12-03 00:16:02.631 187247 DEBUG nova.network.neutron [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:16:02 compute-0 nova_compute[187243]: 2025-12-03 00:16:02.813 187247 DEBUG nova.network.neutron [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:16:03 compute-0 podman[218971]: 2025-12-03 00:16:03.119314108 +0000 UTC m=+0.066908538 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:16:03 compute-0 podman[218972]: 2025-12-03 00:16:03.146732277 +0000 UTC m=+0.100899750 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:16:03 compute-0 nova_compute[187243]: 2025-12-03 00:16:03.320 187247 DEBUG oslo_concurrency.lockutils [req-4f459ce0-4b8d-46f6-a60d-daf16a4c6c8f req-9194da35-4d4d-46e6-9f91-ed5274da1ab6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-72f66e11-43aa-4598-95a3-697bee26b5e1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:16:03 compute-0 nova_compute[187243]: 2025-12-03 00:16:03.321 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquired lock "refresh_cache-72f66e11-43aa-4598-95a3-697bee26b5e1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:16:03 compute-0 nova_compute[187243]: 2025-12-03 00:16:03.321 187247 DEBUG nova.network.neutron [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:16:03 compute-0 nova_compute[187243]: 2025-12-03 00:16:03.944 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:04 compute-0 sshd[128750]: Timeout before authentication for connection from 45.78.222.160 to 38.102.83.77, pid = 217985
Dec 03 00:16:04 compute-0 nova_compute[187243]: 2025-12-03 00:16:04.295 187247 DEBUG nova.network.neutron [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:16:04 compute-0 nova_compute[187243]: 2025-12-03 00:16:04.487 187247 WARNING neutronclient.v2_0.client [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:04 compute-0 nova_compute[187243]: 2025-12-03 00:16:04.651 187247 DEBUG nova.network.neutron [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Updating instance_info_cache with network_info: [{"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:16:04 compute-0 nova_compute[187243]: 2025-12-03 00:16:04.980 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.158 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Releasing lock "refresh_cache-72f66e11-43aa-4598-95a3-697bee26b5e1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.159 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Instance network_info: |[{"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.162 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Start _get_guest_xml network_info=[{"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.167 187247 WARNING nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.168 187247 DEBUG nova.virt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1081576432', uuid='72f66e11-43aa-4598-95a3-697bee26b5e1'), owner=OwnerMeta(userid='0473307cd38b412cbfdbd093053eb1af', username='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin', projectid='e510a0888b4c4fb5860a0f1720b8ed4b', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1290727110'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='tempest-watcher_flavor-1545796181', flavorid='961ca853-f9ec-479e-bfb6-9bdd23ae3e33', memory_mb=1151, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={}, swap=0), network_info=[{"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764720965.1687093) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.180 187247 DEBUG nova.virt.libvirt.host [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.180 187247 DEBUG nova.virt.libvirt.host [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.193 187247 DEBUG nova.virt.libvirt.host [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.194 187247 DEBUG nova.virt.libvirt.host [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.195 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.195 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T00:15:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='961ca853-f9ec-479e-bfb6-9bdd23ae3e33',id=3,is_public=True,memory_mb=1151,name='tempest-watcher_flavor-1545796181',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.196 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.196 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.196 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.196 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.197 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.197 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.197 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.197 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.198 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.198 187247 DEBUG nova.virt.hardware [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.202 187247 DEBUG nova.virt.libvirt.vif [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:15:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1081576432',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1081576432',id=25,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-9bdy2tss',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-
TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:16:00Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=72f66e11-43aa-4598-95a3-697bee26b5e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.202 187247 DEBUG nova.network.os_vif_util [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.203 187247 DEBUG nova.network.os_vif_util [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.203 187247 DEBUG nova.objects.instance [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'pci_devices' on Instance uuid 72f66e11-43aa-4598-95a3-697bee26b5e1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.726 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <uuid>72f66e11-43aa-4598-95a3-697bee26b5e1</uuid>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <name>instance-00000019</name>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <memory>1178624</memory>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1081576432</nova:name>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:16:05</nova:creationTime>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:flavor name="tempest-watcher_flavor-1545796181" id="961ca853-f9ec-479e-bfb6-9bdd23ae3e33">
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:memory>1151</nova:memory>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:extraSpecs/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:16:05 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         <nova:port uuid="7c86d125-a1e0-4d28-bccb-7ba5562c31e9">
Dec 03 00:16:05 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <system>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <entry name="serial">72f66e11-43aa-4598-95a3-697bee26b5e1</entry>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <entry name="uuid">72f66e11-43aa-4598-95a3-697bee26b5e1</entry>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </system>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <os>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </os>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <features>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </features>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.config"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9e:f8:25"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <target dev="tap7c86d125-a1"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/console.log" append="off"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <video>
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </video>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:16:05 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:16:05 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:16:05 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:16:05 compute-0 nova_compute[187243]: </domain>
Dec 03 00:16:05 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.728 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Preparing to wait for external event network-vif-plugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.728 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.729 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.729 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.730 187247 DEBUG nova.virt.libvirt.vif [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:15:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1081576432',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1081576432',id=25,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-9bdy2tss',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name
='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:16:00Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=72f66e11-43aa-4598-95a3-697bee26b5e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.730 187247 DEBUG nova.network.os_vif_util [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.731 187247 DEBUG nova.network.os_vif_util [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.731 187247 DEBUG os_vif [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.732 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.732 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.732 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.733 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.733 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd929e911-cf5a-5fb4-98b8-023d9f000ec6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.734 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.736 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.739 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.739 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c86d125-a1, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.739 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7c86d125-a1, col_values=(('qos', UUID('19e3bebe-28a5-46a6-8ada-8e54d4976047')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.740 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7c86d125-a1, col_values=(('external_ids', {'iface-id': '7c86d125-a1e0-4d28-bccb-7ba5562c31e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:f8:25', 'vm-uuid': '72f66e11-43aa-4598-95a3-697bee26b5e1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.741 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 NetworkManager[55671]: <info>  [1764720965.7419] manager: (tap7c86d125-a1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.743 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.748 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:05 compute-0 nova_compute[187243]: 2025-12-03 00:16:05.749 187247 INFO os_vif [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1')
Dec 03 00:16:07 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.286 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:16:07 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.287 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:16:07 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.287 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] No VIF found with MAC fa:16:3e:9e:f8:25, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:16:07 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.287 187247 INFO nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Using config drive
Dec 03 00:16:07 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.801 187247 WARNING neutronclient.v2_0.client [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:07 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.994 187247 INFO nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Creating config drive at /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.config
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:07.999 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5u2mf1i_ execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.124 187247 DEBUG oslo_concurrency.processutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5u2mf1i_" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:08 compute-0 kernel: tap7c86d125-a1: entered promiscuous mode
Dec 03 00:16:08 compute-0 NetworkManager[55671]: <info>  [1764720968.1908] manager: (tap7c86d125-a1): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Dec 03 00:16:08 compute-0 ovn_controller[95488]: 2025-12-03T00:16:08Z|00171|binding|INFO|Claiming lport 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 for this chassis.
Dec 03 00:16:08 compute-0 ovn_controller[95488]: 2025-12-03T00:16:08Z|00172|binding|INFO|7c86d125-a1e0-4d28-bccb-7ba5562c31e9: Claiming fa:16:3e:9e:f8:25 10.100.0.11
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.202 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:f8:25 10.100.0.11'], port_security=['fa:16:3e:9e:f8:25 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '72f66e11-43aa-4598-95a3-697bee26b5e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=7c86d125-a1e0-4d28-bccb-7ba5562c31e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.204 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd bound to our chassis
Dec 03 00:16:08 compute-0 ovn_controller[95488]: 2025-12-03T00:16:08Z|00173|binding|INFO|Setting lport 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 ovn-installed in OVS
Dec 03 00:16:08 compute-0 ovn_controller[95488]: 2025-12-03T00:16:08Z|00174|binding|INFO|Setting lport 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 up in Southbound
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.207 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.207 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.209 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.224 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ba860779-4361-4b1a-aadc-5de15ff5ce78]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 systemd-udevd[219035]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:16:08 compute-0 systemd-machined[153518]: New machine qemu-16-instance-00000019.
Dec 03 00:16:08 compute-0 NetworkManager[55671]: <info>  [1764720968.2403] device (tap7c86d125-a1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:16:08 compute-0 NetworkManager[55671]: <info>  [1764720968.2420] device (tap7c86d125-a1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:16:08 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000019.
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.255 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[263b92f0-4e61-4f2c-b6c0-cc2fb5e20154]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.257 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[9d478062-f9de-4408-8b92-ce2953d012cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.289 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[58535139-4a55-4333-9481-68b3b30cd708]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.307 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[73cef786-85f4-4524-b555-317890aee11a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503837, 'reachable_time': 40641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219048, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.321 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0b470776-22fe-48b1-8b97-cf05ea0b3507]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503846, 'tstamp': 503846}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219050, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503850, 'tstamp': 503850}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219050, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.323 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.324 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.325 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.325 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.326 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.326 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:08 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:08.327 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fc3861d3-fb9e-4bc7-8f30-c66d59fa0fc1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.946 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.960 187247 DEBUG nova.compute.manager [req-5eb86866-3c32-4c64-991f-80bbd7603752 req-63e75272-ffd4-44e2-80dd-01b2e17b871d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-plugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.961 187247 DEBUG oslo_concurrency.lockutils [req-5eb86866-3c32-4c64-991f-80bbd7603752 req-63e75272-ffd4-44e2-80dd-01b2e17b871d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.961 187247 DEBUG oslo_concurrency.lockutils [req-5eb86866-3c32-4c64-991f-80bbd7603752 req-63e75272-ffd4-44e2-80dd-01b2e17b871d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.961 187247 DEBUG oslo_concurrency.lockutils [req-5eb86866-3c32-4c64-991f-80bbd7603752 req-63e75272-ffd4-44e2-80dd-01b2e17b871d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:08 compute-0 nova_compute[187243]: 2025-12-03 00:16:08.962 187247 DEBUG nova.compute.manager [req-5eb86866-3c32-4c64-991f-80bbd7603752 req-63e75272-ffd4-44e2-80dd-01b2e17b871d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Processing event network-vif-plugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:16:09 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:09.065 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:09 compute-0 nova_compute[187243]: 2025-12-03 00:16:09.066 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:09 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:09.066 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:16:09 compute-0 nova_compute[187243]: 2025-12-03 00:16:09.715 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:16:09 compute-0 nova_compute[187243]: 2025-12-03 00:16:09.718 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:16:09 compute-0 nova_compute[187243]: 2025-12-03 00:16:09.721 187247 INFO nova.virt.libvirt.driver [-] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Instance spawned successfully.
Dec 03 00:16:09 compute-0 nova_compute[187243]: 2025-12-03 00:16:09.722 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:16:10 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:10.067 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.233 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.234 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.235 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.235 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.236 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.236 187247 DEBUG nova.virt.libvirt.driver [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.760 187247 INFO nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Took 9.59 seconds to spawn the instance on the hypervisor.
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.761 187247 DEBUG nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:16:10 compute-0 nova_compute[187243]: 2025-12-03 00:16:10.770 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.020 187247 DEBUG nova.compute.manager [req-167d7803-0ac0-4bb7-b35d-7b417ab65932 req-6703235c-5766-43b0-a781-1f53fdba992f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-plugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.020 187247 DEBUG oslo_concurrency.lockutils [req-167d7803-0ac0-4bb7-b35d-7b417ab65932 req-6703235c-5766-43b0-a781-1f53fdba992f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.020 187247 DEBUG oslo_concurrency.lockutils [req-167d7803-0ac0-4bb7-b35d-7b417ab65932 req-6703235c-5766-43b0-a781-1f53fdba992f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.021 187247 DEBUG oslo_concurrency.lockutils [req-167d7803-0ac0-4bb7-b35d-7b417ab65932 req-6703235c-5766-43b0-a781-1f53fdba992f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.021 187247 DEBUG nova.compute.manager [req-167d7803-0ac0-4bb7-b35d-7b417ab65932 req-6703235c-5766-43b0-a781-1f53fdba992f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] No waiting events found dispatching network-vif-plugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.021 187247 WARNING nova.compute.manager [req-167d7803-0ac0-4bb7-b35d-7b417ab65932 req-6703235c-5766-43b0-a781-1f53fdba992f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received unexpected event network-vif-plugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 for instance with vm_state active and task_state None.
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.421 187247 INFO nova.compute.manager [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Took 15.48 seconds to build instance.
Dec 03 00:16:11 compute-0 nova_compute[187243]: 2025-12-03 00:16:11.926 187247 DEBUG oslo_concurrency.lockutils [None req-57744fb6-d0ad-4611-b15f-2bb7286b4852 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.997s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:13 compute-0 nova_compute[187243]: 2025-12-03 00:16:13.950 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:15 compute-0 nova_compute[187243]: 2025-12-03 00:16:15.772 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:16 compute-0 sshd-session[219059]: Invalid user dd from 102.210.148.92 port 55726
Dec 03 00:16:16 compute-0 sshd-session[219059]: Received disconnect from 102.210.148.92 port 55726:11: Bye Bye [preauth]
Dec 03 00:16:16 compute-0 sshd-session[219059]: Disconnected from invalid user dd 102.210.148.92 port 55726 [preauth]
Dec 03 00:16:16 compute-0 sshd[128750]: drop connection #0 from [45.78.222.160]:34722 on [38.102.83.77]:22 penalty: exceeded LoginGraceTime
Dec 03 00:16:18 compute-0 podman[219061]: 2025-12-03 00:16:18.137503592 +0000 UTC m=+0.077747706 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Dec 03 00:16:18 compute-0 nova_compute[187243]: 2025-12-03 00:16:18.952 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:19 compute-0 sshd-session[219083]: Invalid user casaos from 20.123.120.169 port 56946
Dec 03 00:16:19 compute-0 podman[219085]: 2025-12-03 00:16:19.945235247 +0000 UTC m=+0.063105754 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS)
Dec 03 00:16:19 compute-0 sshd-session[219083]: Received disconnect from 20.123.120.169 port 56946:11: Bye Bye [preauth]
Dec 03 00:16:19 compute-0 sshd-session[219083]: Disconnected from invalid user casaos 20.123.120.169 port 56946 [preauth]
Dec 03 00:16:20 compute-0 nova_compute[187243]: 2025-12-03 00:16:20.774 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:23 compute-0 nova_compute[187243]: 2025-12-03 00:16:23.954 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:24 compute-0 ovn_controller[95488]: 2025-12-03T00:16:24Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9e:f8:25 10.100.0.11
Dec 03 00:16:24 compute-0 ovn_controller[95488]: 2025-12-03T00:16:24Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9e:f8:25 10.100.0.11
Dec 03 00:16:25 compute-0 sshd-session[219120]: Received disconnect from 23.95.37.90 port 48206:11: Bye Bye [preauth]
Dec 03 00:16:25 compute-0 sshd-session[219120]: Disconnected from authenticating user root 23.95.37.90 port 48206 [preauth]
Dec 03 00:16:25 compute-0 nova_compute[187243]: 2025-12-03 00:16:25.776 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:28 compute-0 nova_compute[187243]: 2025-12-03 00:16:28.956 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:29 compute-0 podman[197600]: time="2025-12-03T00:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:16:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:16:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Dec 03 00:16:29 compute-0 sshd-session[219122]: Received disconnect from 49.247.36.49 port 6182:11: Bye Bye [preauth]
Dec 03 00:16:29 compute-0 sshd-session[219122]: Disconnected from authenticating user root 49.247.36.49 port 6182 [preauth]
Dec 03 00:16:30 compute-0 nova_compute[187243]: 2025-12-03 00:16:30.651 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Check if temp file /var/lib/nova/instances/tmpzam0o4uu exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:16:30 compute-0 nova_compute[187243]: 2025-12-03 00:16:30.656 187247 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:16:30 compute-0 nova_compute[187243]: 2025-12-03 00:16:30.778 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:31 compute-0 podman[219124]: 2025-12-03 00:16:31.085713463 +0000 UTC m=+0.042262887 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: ERROR   00:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: ERROR   00:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: ERROR   00:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: ERROR   00:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: ERROR   00:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:16:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:16:33 compute-0 nova_compute[187243]: 2025-12-03 00:16:33.959 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:34 compute-0 podman[219149]: 2025-12-03 00:16:34.117984121 +0000 UTC m=+0.066659540 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:16:34 compute-0 podman[219150]: 2025-12-03 00:16:34.131791663 +0000 UTC m=+0.085033738 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.329 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.387 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.388 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.440 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.441 187247 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Preparing to wait for external event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.442 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.442 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.442 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:35 compute-0 nova_compute[187243]: 2025-12-03 00:16:35.780 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:38 compute-0 ovn_controller[95488]: 2025-12-03T00:16:38Z|00175|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:16:38 compute-0 nova_compute[187243]: 2025-12-03 00:16:38.960 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:40 compute-0 nova_compute[187243]: 2025-12-03 00:16:40.783 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:41 compute-0 nova_compute[187243]: 2025-12-03 00:16:41.187 187247 DEBUG nova.compute.manager [req-2fcea2dc-68a3-46a7-b17e-da401ef7f153 req-3947a6be-7126-4427-a5c0-ed90d3dd9452 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:41 compute-0 nova_compute[187243]: 2025-12-03 00:16:41.187 187247 DEBUG oslo_concurrency.lockutils [req-2fcea2dc-68a3-46a7-b17e-da401ef7f153 req-3947a6be-7126-4427-a5c0-ed90d3dd9452 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:41 compute-0 nova_compute[187243]: 2025-12-03 00:16:41.188 187247 DEBUG oslo_concurrency.lockutils [req-2fcea2dc-68a3-46a7-b17e-da401ef7f153 req-3947a6be-7126-4427-a5c0-ed90d3dd9452 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:41 compute-0 nova_compute[187243]: 2025-12-03 00:16:41.188 187247 DEBUG oslo_concurrency.lockutils [req-2fcea2dc-68a3-46a7-b17e-da401ef7f153 req-3947a6be-7126-4427-a5c0-ed90d3dd9452 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:41 compute-0 nova_compute[187243]: 2025-12-03 00:16:41.188 187247 DEBUG nova.compute.manager [req-2fcea2dc-68a3-46a7-b17e-da401ef7f153 req-3947a6be-7126-4427-a5c0-ed90d3dd9452 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No event matching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 in dict_keys([('network-vif-plugged', '75f7bf8b-141c-44e2-be3c-1fdae9af1077')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:16:41 compute-0 nova_compute[187243]: 2025-12-03 00:16:41.189 187247 DEBUG nova.compute.manager [req-2fcea2dc-68a3-46a7-b17e-da401ef7f153 req-3947a6be-7126-4427-a5c0-ed90d3dd9452 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.235 187247 DEBUG nova.compute.manager [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.236 187247 DEBUG oslo_concurrency.lockutils [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.236 187247 DEBUG oslo_concurrency.lockutils [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.236 187247 DEBUG oslo_concurrency.lockutils [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.236 187247 DEBUG nova.compute.manager [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Processing event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.236 187247 DEBUG nova.compute.manager [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-changed-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.237 187247 DEBUG nova.compute.manager [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Refreshing instance network info cache due to event network-changed-75f7bf8b-141c-44e2-be3c-1fdae9af1077. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.237 187247 DEBUG oslo_concurrency.lockutils [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.237 187247 DEBUG oslo_concurrency.lockutils [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.237 187247 DEBUG nova.network.neutron [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Refreshing network info cache for port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.744 187247 WARNING neutronclient.v2_0.client [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.963 187247 INFO nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Took 8.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.964 187247 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:16:43 compute-0 nova_compute[187243]: 2025-12-03 00:16:43.967 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:44 compute-0 nova_compute[187243]: 2025-12-03 00:16:44.474 187247 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzam0o4uu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1df5044-c7ad-42e6-93bd-4b5a853ab3b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:16:44 compute-0 nova_compute[187243]: 2025-12-03 00:16:44.733 187247 WARNING neutronclient.v2_0.client [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:44 compute-0 nova_compute[187243]: 2025-12-03 00:16:44.987 187247 DEBUG nova.objects.instance [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:16:44 compute-0 nova_compute[187243]: 2025-12-03 00:16:44.988 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:16:44 compute-0 nova_compute[187243]: 2025-12-03 00:16:44.990 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:16:44 compute-0 nova_compute[187243]: 2025-12-03 00:16:44.990 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.492 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.493 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.498 187247 DEBUG nova.virt.libvirt.vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:15:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-984503060',id=24,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:15:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-4tk0mv8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:15:49Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.498 187247 DEBUG nova.network.os_vif_util [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.499 187247 DEBUG nova.network.os_vif_util [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.499 187247 DEBUG nova.virt.libvirt.migration [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:e6:86:86"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <target dev="tap75f7bf8b-14"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]: </interface>
Dec 03 00:16:45 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.500 187247 DEBUG nova.virt.libvirt.migration [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <name>instance-00000018</name>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <uuid>c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</uuid>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-984503060</nova:name>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:15:44</nova:creationTime>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:flavor name="tempest-watcher_flavor-1545796181" id="961ca853-f9ec-479e-bfb6-9bdd23ae3e33">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:memory>1151</nova:memory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:extraSpecs/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:16:45 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:port uuid="75f7bf8b-141c-44e2-be3c-1fdae9af1077">
Dec 03 00:16:45 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <memory unit="KiB">1178624</memory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">1178624</currentMemory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <system>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="serial">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="uuid">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </system>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <os>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </os>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <features>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </features>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:e6:86:86"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="tap75f7bf8b-14"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </target>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </console>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </input>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <video>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </video>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]: </domain>
Dec 03 00:16:45 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.501 187247 DEBUG nova.virt.libvirt.migration [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <name>instance-00000018</name>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <uuid>c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</uuid>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-984503060</nova:name>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:15:44</nova:creationTime>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:flavor name="tempest-watcher_flavor-1545796181" id="961ca853-f9ec-479e-bfb6-9bdd23ae3e33">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:memory>1151</nova:memory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:extraSpecs/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:16:45 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:port uuid="75f7bf8b-141c-44e2-be3c-1fdae9af1077">
Dec 03 00:16:45 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <memory unit="KiB">1178624</memory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">1178624</currentMemory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <system>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="serial">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="uuid">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </system>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <os>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </os>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <features>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </features>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:e6:86:86"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="tap75f7bf8b-14"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </target>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </console>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </input>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <video>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </video>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]: </domain>
Dec 03 00:16:45 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.502 187247 DEBUG nova.virt.libvirt.migration [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <name>instance-00000018</name>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <uuid>c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</uuid>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-984503060</nova:name>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:15:44</nova:creationTime>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:flavor name="tempest-watcher_flavor-1545796181" id="961ca853-f9ec-479e-bfb6-9bdd23ae3e33">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:memory>1151</nova:memory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:extraSpecs/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:16:45 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:user uuid="0473307cd38b412cbfdbd093053eb1af">tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin</nova:user>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:project uuid="e510a0888b4c4fb5860a0f1720b8ed4b">tempest-TestExecuteWorkloadBalanceStrategy-1290727110</nova:project>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <nova:port uuid="75f7bf8b-141c-44e2-be3c-1fdae9af1077">
Dec 03 00:16:45 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <memory unit="KiB">1178624</memory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">1178624</currentMemory>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <system>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="serial">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="uuid">c1df5044-c7ad-42e6-93bd-4b5a853ab3b8</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </system>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <os>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </os>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <features>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </features>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/disk.config"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:e6:86:86"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap75f7bf8b-14"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:16:45 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       </target>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8/console.log" append="off"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </console>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </input>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <video>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </video>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:16:45 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:16:45 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:16:45 compute-0 nova_compute[187243]: </domain>
Dec 03 00:16:45 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.502 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.784 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.995 187247 DEBUG nova.virt.libvirt.migration [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:16:45 compute-0 nova_compute[187243]: 2025-12-03 00:16:45.995 187247 INFO nova.virt.libvirt.migration [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:16:46 compute-0 nova_compute[187243]: 2025-12-03 00:16:46.732 187247 DEBUG nova.network.neutron [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updated VIF entry in instance network info cache for port 75f7bf8b-141c-44e2-be3c-1fdae9af1077. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:16:46 compute-0 nova_compute[187243]: 2025-12-03 00:16:46.732 187247 DEBUG nova.network.neutron [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating instance_info_cache with network_info: [{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.014 187247 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.238 187247 DEBUG oslo_concurrency.lockutils [req-0414dc89-68db-4a6f-9ae0-3e9c1d2581ff req-2fd5dce8-0c94-4abd-8ba0-320e949f355d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c1df5044-c7ad-42e6-93bd-4b5a853ab3b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:16:47 compute-0 kernel: tap75f7bf8b-14 (unregistering): left promiscuous mode
Dec 03 00:16:47 compute-0 NetworkManager[55671]: <info>  [1764721007.5198] device (tap75f7bf8b-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.560 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00176|binding|INFO|Releasing lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 from this chassis (sb_readonly=0)
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00177|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 down in Southbound
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00178|binding|INFO|Removing iface tap75f7bf8b-14 ovn-installed in OVS
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.564 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.568 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:86:86 10.100.0.3'], port_security=['fa:16:3e:e6:86:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=75f7bf8b-141c-44e2-be3c-1fdae9af1077) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.569 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.570 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.573 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.586 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[af262276-0243-4a31-babc-e39ed5d5b020]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec 03 00:16:47 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000018.scope: Consumed 14.685s CPU time.
Dec 03 00:16:47 compute-0 systemd-machined[153518]: Machine qemu-15-instance-00000018 terminated.
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.612 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[a1703f48-8bd5-40ef-9bd7-52f7ddfcfd88]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.614 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7ca308-17e6-428f-ab96-34cb75aa509e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.641 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[91c0929d-df17-419d-af8f-56e506d8ccd2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.657 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bc41d5db-e8e6-4b53-9092-7bc874614c1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503837, 'reachable_time': 40641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219220, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.670 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa5c1c4-6b96-4ab5-a039-c1ef1912a01f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503846, 'tstamp': 503846}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219221, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503850, 'tstamp': 503850}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219221, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.671 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.673 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.678 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.678 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.678 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.678 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.678 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.680 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0b17e471-8dfb-4cc9-9286-dab663a1f683]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 kernel: tap75f7bf8b-14: entered promiscuous mode
Dec 03 00:16:47 compute-0 kernel: tap75f7bf8b-14 (unregistering): left promiscuous mode
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00179|binding|INFO|Claiming lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 for this chassis.
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00180|binding|INFO|75f7bf8b-141c-44e2-be3c-1fdae9af1077: Claiming fa:16:3e:e6:86:86 10.100.0.3
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.717 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.727 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:86:86 10.100.0.3'], port_security=['fa:16:3e:e6:86:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=75f7bf8b-141c-44e2-be3c-1fdae9af1077) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.729 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd bound to our chassis
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.731 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00181|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 ovn-installed in OVS
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00182|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 up in Southbound
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00183|binding|INFO|Releasing lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 from this chassis (sb_readonly=1)
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00184|if_status|INFO|Not setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 down as sb is readonly
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.734 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00185|binding|INFO|Removing iface tap75f7bf8b-14 ovn-installed in OVS
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00186|binding|INFO|Releasing lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 from this chassis (sb_readonly=0)
Dec 03 00:16:47 compute-0 ovn_controller[95488]: 2025-12-03T00:16:47Z|00187|binding|INFO|Setting lport 75f7bf8b-141c-44e2-be3c-1fdae9af1077 down in Southbound
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.745 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.745 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4641db-406f-4dc6-91f8-8df9bedce721]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.754 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:86:86 10.100.0.3'], port_security=['fa:16:3e:e6:86:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c1df5044-c7ad-42e6-93bd-4b5a853ab3b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=75f7bf8b-141c-44e2-be3c-1fdae9af1077) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.755 187247 DEBUG nova.virt.libvirt.guest [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.755 187247 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migration operation has completed
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.756 187247 INFO nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] _post_live_migration() is started..
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.758 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.758 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.758 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.770 187247 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.770 187247 WARNING neutronclient.v2_0.client [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.772 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[0430470b-44e5-46b7-86b0-13bd0c182116]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.774 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[2eaea484-b12c-4688-8c8b-d126b3903ea6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.797 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[665503c3-3137-4129-a0ed-3315800def72]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.814 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[58cfad4b-df14-4323-a6c4-7fe73dfc11f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503837, 'reachable_time': 40641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219238, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.828 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[79a18cc7-bca7-4857-ae26-d9cedf9420ed]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503846, 'tstamp': 503846}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219239, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503850, 'tstamp': 503850}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219239, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.829 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.830 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.834 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.834 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.834 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.835 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.835 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.836 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1d66245e-e984-43f8-a970-655769de6fcc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.837 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.838 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.849 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[dcaf8279-9f69-4e01-985e-d41cc0989b8a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.870 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[a0dd7922-3355-4fa1-bff4-9a988d0e639b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.872 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[707bbd9f-9a5a-4b7a-a044-6830541686ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.891 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8715f1-c398-4350-95f7-003a4f66adb0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.906 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7528ba-872a-4e4f-b5ca-3ba9a4e44636]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee60e03c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:2b:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503837, 'reachable_time': 40641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219246, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.918 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[013d0352-4c8d-4772-a3d6-a17068d50d26]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503846, 'tstamp': 503846}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219247, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee60e03c-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503850, 'tstamp': 503850}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219247, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.919 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.920 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.923 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.923 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee60e03c-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.924 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.924 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee60e03c-a0, col_values=(('external_ids', {'iface-id': '42f0d9e7-7c77-4247-8972-6beac3a53206'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.924 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:16:47 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:16:47.925 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e4bbe161-12d2-452e-a8fc-41d5cb0c5e61]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ee60e03c-ab3a-419f-84ef-62aec4b6b0dd\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.977 187247 DEBUG nova.compute.manager [req-6d68fd68-947d-4050-aa2f-90a6cada7634 req-20528b5d-8401-43e9-b075-bba774cc91a9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.978 187247 DEBUG oslo_concurrency.lockutils [req-6d68fd68-947d-4050-aa2f-90a6cada7634 req-20528b5d-8401-43e9-b075-bba774cc91a9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.978 187247 DEBUG oslo_concurrency.lockutils [req-6d68fd68-947d-4050-aa2f-90a6cada7634 req-20528b5d-8401-43e9-b075-bba774cc91a9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.978 187247 DEBUG oslo_concurrency.lockutils [req-6d68fd68-947d-4050-aa2f-90a6cada7634 req-20528b5d-8401-43e9-b075-bba774cc91a9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.979 187247 DEBUG nova.compute.manager [req-6d68fd68-947d-4050-aa2f-90a6cada7634 req-20528b5d-8401-43e9-b075-bba774cc91a9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:47 compute-0 nova_compute[187243]: 2025-12-03 00:16:47.979 187247 DEBUG nova.compute.manager [req-6d68fd68-947d-4050-aa2f-90a6cada7634 req-20528b5d-8401-43e9-b075-bba774cc91a9 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.804 187247 DEBUG nova.network.neutron [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 75f7bf8b-141c-44e2-be3c-1fdae9af1077 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.805 187247 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.805 187247 DEBUG nova.virt.libvirt.vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:15:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-984503060',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-984503060',id=24,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:15:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-4tk0mv8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:16:23Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=c1df5044-c7ad-42e6-93bd-4b5a853ab3b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.805 187247 DEBUG nova.network.os_vif_util [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "address": "fa:16:3e:e6:86:86", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f7bf8b-14", "ovs_interfaceid": "75f7bf8b-141c-44e2-be3c-1fdae9af1077", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.806 187247 DEBUG nova.network.os_vif_util [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.806 187247 DEBUG os_vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.808 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.808 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75f7bf8b-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.809 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.811 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.812 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.812 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f2ae0b30-19b3-4498-aa0e-b567e7fb6f77) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.812 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.814 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.817 187247 INFO os_vif [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:86:86,bridge_name='br-int',has_traffic_filtering=True,id=75f7bf8b-141c-44e2-be3c-1fdae9af1077,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f7bf8b-14')
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.817 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.817 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.817 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.818 187247 DEBUG nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.818 187247 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Deleting instance files /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8_del
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.819 187247 INFO nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Deletion of /var/lib/nova/instances/c1df5044-c7ad-42e6-93bd-4b5a853ab3b8_del complete
Dec 03 00:16:48 compute-0 nova_compute[187243]: 2025-12-03 00:16:48.963 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:49 compute-0 podman[219249]: 2025-12-03 00:16:49.09663426 +0000 UTC m=+0.056230974 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Dec 03 00:16:49 compute-0 nova_compute[187243]: 2025-12-03 00:16:49.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:49 compute-0 nova_compute[187243]: 2025-12-03 00:16:49.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:49 compute-0 nova_compute[187243]: 2025-12-03 00:16:49.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:49 compute-0 nova_compute[187243]: 2025-12-03 00:16:49.103 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.167 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.168 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.168 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.168 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.168 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.169 187247 WARNING nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received unexpected event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with vm_state active and task_state migrating.
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.169 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.169 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.169 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.169 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.170 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.170 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.170 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.170 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.170 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.171 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.171 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.171 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.171 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.171 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.172 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.172 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.172 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.172 187247 WARNING nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received unexpected event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with vm_state active and task_state migrating.
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.172 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.173 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.173 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.173 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.173 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.173 187247 WARNING nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received unexpected event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with vm_state active and task_state migrating.
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.174 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.174 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.174 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.174 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.174 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.175 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.175 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.175 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.175 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.175 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.176 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.176 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-unplugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.176 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.176 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.176 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.176 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.177 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.177 187247 WARNING nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received unexpected event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with vm_state active and task_state migrating.
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.177 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.177 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.177 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.177 187247 DEBUG oslo_concurrency.lockutils [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.178 187247 DEBUG nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] No waiting events found dispatching network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.178 187247 WARNING nova.compute.manager [req-fc7ebea4-15d9-4ce7-82cb-0f5413e3c9f7 req-674745f0-ae8f-48e5-9cf0-f36cc59c2d84 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Received unexpected event network-vif-plugged-75f7bf8b-141c-44e2-be3c-1fdae9af1077 for instance with vm_state active and task_state migrating.
Dec 03 00:16:50 compute-0 podman[219271]: 2025-12-03 00:16:50.262725894 +0000 UTC m=+0.079832356 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.683 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.737 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.738 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.789 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.912 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.913 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.929 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.930 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5604MB free_disk=73.13357925415039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.930 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:50 compute-0 nova_compute[187243]: 2025-12-03 00:16:50.930 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:51 compute-0 nova_compute[187243]: 2025-12-03 00:16:51.949 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Updating resource usage from migration 1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4
Dec 03 00:16:51 compute-0 nova_compute[187243]: 2025-12-03 00:16:51.978 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 72f66e11-43aa-4598-95a3-697bee26b5e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:16:51 compute-0 nova_compute[187243]: 2025-12-03 00:16:51.978 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1151, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:16:51 compute-0 nova_compute[187243]: 2025-12-03 00:16:51.979 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:16:51 compute-0 nova_compute[187243]: 2025-12-03 00:16:51.979 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=2814MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:16:50 up  1:25,  0 user,  load average: 0.45, 0.33, 0.30\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '1', 'num_os_type_None': '2', 'num_proj_e510a0888b4c4fb5860a0f1720b8ed4b': '2', 'io_workload': '0', 'num_task_None': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:16:52 compute-0 nova_compute[187243]: 2025-12-03 00:16:52.024 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:16:52 compute-0 sshd-session[219298]: Invalid user sipv from 61.220.235.10 port 56478
Dec 03 00:16:52 compute-0 sshd-session[219298]: Received disconnect from 61.220.235.10 port 56478:11: Bye Bye [preauth]
Dec 03 00:16:52 compute-0 sshd-session[219298]: Disconnected from invalid user sipv 61.220.235.10 port 56478 [preauth]
Dec 03 00:16:52 compute-0 nova_compute[187243]: 2025-12-03 00:16:52.530 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:16:53 compute-0 nova_compute[187243]: 2025-12-03 00:16:53.038 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:16:53 compute-0 nova_compute[187243]: 2025-12-03 00:16:53.039 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:53 compute-0 nova_compute[187243]: 2025-12-03 00:16:53.814 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:53 compute-0 nova_compute[187243]: 2025-12-03 00:16:53.965 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:54 compute-0 nova_compute[187243]: 2025-12-03 00:16:54.039 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:54 compute-0 nova_compute[187243]: 2025-12-03 00:16:54.039 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:54 compute-0 nova_compute[187243]: 2025-12-03 00:16:54.039 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:54 compute-0 nova_compute[187243]: 2025-12-03 00:16:54.039 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:16:54 compute-0 nova_compute[187243]: 2025-12-03 00:16:54.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:54 compute-0 nova_compute[187243]: 2025-12-03 00:16:54.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:16:58 compute-0 nova_compute[187243]: 2025-12-03 00:16:58.817 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:58 compute-0 nova_compute[187243]: 2025-12-03 00:16:58.967 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:16:59 compute-0 nova_compute[187243]: 2025-12-03 00:16:59.680 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:16:59 compute-0 nova_compute[187243]: 2025-12-03 00:16:59.680 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:16:59 compute-0 nova_compute[187243]: 2025-12-03 00:16:59.681 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c1df5044-c7ad-42e6-93bd-4b5a853ab3b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:16:59 compute-0 podman[197600]: time="2025-12-03T00:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:16:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:16:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.193 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.193 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.194 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.194 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:17:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:00.714 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:00.715 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:00.715 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.947 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.948 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.948 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.948 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:00 compute-0 nova_compute[187243]: 2025-12-03 00:17:00.948 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.020 187247 INFO nova.compute.manager [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Terminating instance
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.225 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.290 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.291 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.349 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: ERROR   00:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: ERROR   00:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: ERROR   00:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: ERROR   00:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: ERROR   00:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:17:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.473 187247 WARNING nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.474 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.492 187247 DEBUG oslo_concurrency.processutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.493 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5611MB free_disk=73.13357925415039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.493 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.493 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.542 187247 DEBUG nova.compute.manager [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 03 00:17:01 compute-0 kernel: tap7c86d125-a1 (unregistering): left promiscuous mode
Dec 03 00:17:01 compute-0 NetworkManager[55671]: <info>  [1764721021.5762] device (tap7c86d125-a1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:17:01 compute-0 ovn_controller[95488]: 2025-12-03T00:17:01Z|00188|binding|INFO|Releasing lport 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 from this chassis (sb_readonly=0)
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.582 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 ovn_controller[95488]: 2025-12-03T00:17:01Z|00189|binding|INFO|Setting lport 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 down in Southbound
Dec 03 00:17:01 compute-0 ovn_controller[95488]: 2025-12-03T00:17:01Z|00190|binding|INFO|Removing iface tap7c86d125-a1 ovn-installed in OVS
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.584 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.592 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:f8:25 10.100.0.11'], port_security=['fa:16:3e:9e:f8:25 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '72f66e11-43aa-4598-95a3-697bee26b5e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e510a0888b4c4fb5860a0f1720b8ed4b', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f1e1fe27-b2d8-445b-bf72-1b1a8b133d14', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f585d4-379a-4fd2-84e7-6b4069bbb279, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=7c86d125-a1e0-4d28-bccb-7ba5562c31e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.592 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 7c86d125-a1e0-4d28-bccb-7ba5562c31e9 in datapath ee60e03c-ab3a-419f-84ef-62aec4b6b0dd unbound from our chassis
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.593 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.594 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d3af0f-b7a1-42c4-8036-0ba81342bcda]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.594 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd namespace which is not needed anymore
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.600 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Deactivated successfully.
Dec 03 00:17:01 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000019.scope: Consumed 15.471s CPU time.
Dec 03 00:17:01 compute-0 systemd-machined[153518]: Machine qemu-16-instance-00000019 terminated.
Dec 03 00:17:01 compute-0 podman[219311]: 2025-12-03 00:17:01.660789538 +0000 UTC m=+0.043707526 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:17:01 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [NOTICE]   (218869) : haproxy version is 3.0.5-8e879a5
Dec 03 00:17:01 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [NOTICE]   (218869) : path to executable is /usr/sbin/haproxy
Dec 03 00:17:01 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [WARNING]  (218869) : Exiting Master process...
Dec 03 00:17:01 compute-0 podman[219357]: 2025-12-03 00:17:01.695082092 +0000 UTC m=+0.024963258 container kill cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:17:01 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [ALERT]    (218869) : Current worker (218871) exited with code 143 (Terminated)
Dec 03 00:17:01 compute-0 neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd[218865]: [WARNING]  (218869) : All workers exited. Exiting... (0)
Dec 03 00:17:01 compute-0 systemd[1]: libpod-cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137.scope: Deactivated successfully.
Dec 03 00:17:01 compute-0 podman[219374]: 2025-12-03 00:17:01.734567678 +0000 UTC m=+0.022976986 container died cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.760 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137-userdata-shm.mount: Deactivated successfully.
Dec 03 00:17:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-9bca642642759c127988d41f80f182411e5121a925e3d2446f94f2dc42754092-merged.mount: Deactivated successfully.
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.765 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 podman[219374]: 2025-12-03 00:17:01.769131269 +0000 UTC m=+0.057540567 container cleanup cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:17:01 compute-0 systemd[1]: libpod-conmon-cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137.scope: Deactivated successfully.
Dec 03 00:17:01 compute-0 podman[219376]: 2025-12-03 00:17:01.786847911 +0000 UTC m=+0.069174444 container remove cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.793 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[732b698a-1cf8-4770-ac5f-f7c25772ada9]: (4, ("Wed Dec  3 12:17:01 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137)\ncbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137\nWed Dec  3 12:17:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd (cbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137)\ncbe1141729b7568c68207da2a04ca0065d9f4355eb6f338f98eb5e92d32d1137\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.794 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[abdfbebb-99ac-4a56-a855-6fdd6a832553]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.795 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee60e03c-ab3a-419f-84ef-62aec4b6b0dd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.795 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff9c940-3542-4a3d-bcc6-c584d4703b35]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.796 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee60e03c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:01 compute-0 kernel: tapee60e03c-a0: left promiscuous mode
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.798 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.803 187247 INFO nova.virt.libvirt.driver [-] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Instance destroyed successfully.
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.804 187247 DEBUG nova.objects.instance [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lazy-loading 'resources' on Instance uuid 72f66e11-43aa-4598-95a3-697bee26b5e1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.811 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.814 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8109cfe4-59e8-4d5f-839f-ffac112f8f89]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.827 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ec3188-cbbe-4f94-81c2-3dbdce687f51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.828 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[60ef55ba-fbcb-4957-aebd-a5e17f3657c4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.842 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2ac34f-25c8-482c-890a-457c65d131dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503831, 'reachable_time': 15573, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219425, 'error': None, 'target': 'ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.844 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee60e03c-ab3a-419f-84ef-62aec4b6b0dd deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:17:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:01.845 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[b37031f9-4203-4e88-a608-c7cf3426430a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dee60e03c\x2dab3a\x2d419f\x2d84ef\x2d62aec4b6b0dd.mount: Deactivated successfully.
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.932 187247 DEBUG nova.compute.manager [req-0be65abf-4242-4249-ae98-8f5417da5ece req-7753fcc2-b236-4fda-9f63-3492abbb9ab2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-unplugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.932 187247 DEBUG oslo_concurrency.lockutils [req-0be65abf-4242-4249-ae98-8f5417da5ece req-7753fcc2-b236-4fda-9f63-3492abbb9ab2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.932 187247 DEBUG oslo_concurrency.lockutils [req-0be65abf-4242-4249-ae98-8f5417da5ece req-7753fcc2-b236-4fda-9f63-3492abbb9ab2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.933 187247 DEBUG oslo_concurrency.lockutils [req-0be65abf-4242-4249-ae98-8f5417da5ece req-7753fcc2-b236-4fda-9f63-3492abbb9ab2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.933 187247 DEBUG nova.compute.manager [req-0be65abf-4242-4249-ae98-8f5417da5ece req-7753fcc2-b236-4fda-9f63-3492abbb9ab2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] No waiting events found dispatching network-vif-unplugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:17:01 compute-0 nova_compute[187243]: 2025-12-03 00:17:01.933 187247 DEBUG nova.compute.manager [req-0be65abf-4242-4249-ae98-8f5417da5ece req-7753fcc2-b236-4fda-9f63-3492abbb9ab2 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-unplugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.310 187247 DEBUG nova.virt.libvirt.vif [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:15:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1081576432',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1081576432',id=25,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:16:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e510a0888b4c4fb5860a0f1720b8ed4b',ramdisk_id='',reservation_id='r-9bdy2tss',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1290727110-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:16:10Z,user_data=None,user_id='0473307cd38b412cbfdbd093053eb1af',uuid=72f66e11-43aa-4598-95a3-697bee26b5e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.311 187247 DEBUG nova.network.os_vif_util [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converting VIF {"id": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "address": "fa:16:3e:9e:f8:25", "network": {"id": "ee60e03c-ab3a-419f-84ef-62aec4b6b0dd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-780198702-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1299554f9c3e4ee7a7991ca25c47f7c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c86d125-a1", "ovs_interfaceid": "7c86d125-a1e0-4d28-bccb-7ba5562c31e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.312 187247 DEBUG nova.network.os_vif_util [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.312 187247 DEBUG os_vif [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.315 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.316 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c86d125-a1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.317 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.318 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.319 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.320 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=19e3bebe-28a5-46a6-8ada-8e54d4976047) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.321 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.321 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.323 187247 INFO os_vif [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:f8:25,bridge_name='br-int',has_traffic_filtering=True,id=7c86d125-a1e0-4d28-bccb-7ba5562c31e9,network=Network(ee60e03c-ab3a-419f-84ef-62aec4b6b0dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c86d125-a1')
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.324 187247 INFO nova.virt.libvirt.driver [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Deleting instance files /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1_del
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.324 187247 INFO nova.virt.libvirt.driver [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Deletion of /var/lib/nova/instances/72f66e11-43aa-4598-95a3-697bee26b5e1_del complete
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.510 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance c1df5044-c7ad-42e6-93bd-4b5a853ab3b8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.836 187247 INFO nova.compute.manager [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.836 187247 DEBUG oslo.service.backend._eventlet.loopingcall [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.837 187247 DEBUG nova.compute.manager [-] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.837 187247 DEBUG nova.network.neutron [-] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 03 00:17:02 compute-0 nova_compute[187243]: 2025-12-03 00:17:02.838 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.019 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.052 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Instance 72f66e11-43aa-4598-95a3-697bee26b5e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.052 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1151, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.052 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.052 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:17:01 up  1:25,  0 user,  load average: 0.38, 0.32, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_e510a0888b4c4fb5860a0f1720b8ed4b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.071 187247 DEBUG nova.scheduler.client.report [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.088 187247 DEBUG nova.scheduler.client.report [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.089 187247 DEBUG nova.compute.provider_tree [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.101 187247 DEBUG nova.scheduler.client.report [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.115 187247 DEBUG nova.scheduler.client.report [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.158 187247 DEBUG nova.compute.provider_tree [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.671 187247 DEBUG nova.scheduler.client.report [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.840 187247 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.968 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.986 187247 DEBUG nova.compute.manager [req-c971d673-c77e-4ab8-a627-1c437fd650b7 req-d5500c9c-7e44-4c8f-ba38-278ace877b35 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-unplugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.986 187247 DEBUG oslo_concurrency.lockutils [req-c971d673-c77e-4ab8-a627-1c437fd650b7 req-d5500c9c-7e44-4c8f-ba38-278ace877b35 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.986 187247 DEBUG oslo_concurrency.lockutils [req-c971d673-c77e-4ab8-a627-1c437fd650b7 req-d5500c9c-7e44-4c8f-ba38-278ace877b35 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.987 187247 DEBUG oslo_concurrency.lockutils [req-c971d673-c77e-4ab8-a627-1c437fd650b7 req-d5500c9c-7e44-4c8f-ba38-278ace877b35 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.987 187247 DEBUG nova.compute.manager [req-c971d673-c77e-4ab8-a627-1c437fd650b7 req-d5500c9c-7e44-4c8f-ba38-278ace877b35 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] No waiting events found dispatching network-vif-unplugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:17:03 compute-0 nova_compute[187243]: 2025-12-03 00:17:03.987 187247 DEBUG nova.compute.manager [req-c971d673-c77e-4ab8-a627-1c437fd650b7 req-d5500c9c-7e44-4c8f-ba38-278ace877b35 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-unplugged-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:17:04 compute-0 nova_compute[187243]: 2025-12-03 00:17:04.184 187247 DEBUG nova.compute.resource_tracker [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:17:04 compute-0 nova_compute[187243]: 2025-12-03 00:17:04.184 187247 DEBUG oslo_concurrency.lockutils [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.691s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:04 compute-0 nova_compute[187243]: 2025-12-03 00:17:04.220 187247 INFO nova.compute.manager [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:17:04 compute-0 nova_compute[187243]: 2025-12-03 00:17:04.966 187247 DEBUG nova.network.neutron [-] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:17:05 compute-0 podman[219426]: 2025-12-03 00:17:05.112774114 +0000 UTC m=+0.058104092 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Dec 03 00:17:05 compute-0 podman[219427]: 2025-12-03 00:17:05.161601049 +0000 UTC m=+0.099868707 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:17:05 compute-0 nova_compute[187243]: 2025-12-03 00:17:05.294 187247 INFO nova.scheduler.client.report [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 1a09d0f0-d7a1-422a-acd5-e6a0723f1ba4
Dec 03 00:17:05 compute-0 nova_compute[187243]: 2025-12-03 00:17:05.295 187247 DEBUG nova.virt.libvirt.driver [None req-5537e9d5-0824-4d42-959c-a27eeaea9927 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c1df5044-c7ad-42e6-93bd-4b5a853ab3b8] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:17:05 compute-0 nova_compute[187243]: 2025-12-03 00:17:05.474 187247 INFO nova.compute.manager [-] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Took 2.64 seconds to deallocate network for instance.
Dec 03 00:17:05 compute-0 nova_compute[187243]: 2025-12-03 00:17:05.998 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:05 compute-0 nova_compute[187243]: 2025-12-03 00:17:05.998 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:06 compute-0 nova_compute[187243]: 2025-12-03 00:17:06.043 187247 DEBUG nova.compute.provider_tree [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:17:06 compute-0 nova_compute[187243]: 2025-12-03 00:17:06.085 187247 DEBUG nova.compute.manager [req-33cbbe5f-d1fb-4de9-a0dd-8f78c033b991 req-f2df1b65-1547-4fd1-b995-26d78f4d3cdd 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 72f66e11-43aa-4598-95a3-697bee26b5e1] Received event network-vif-deleted-7c86d125-a1e0-4d28-bccb-7ba5562c31e9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:06 compute-0 nova_compute[187243]: 2025-12-03 00:17:06.554 187247 DEBUG nova.scheduler.client.report [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:17:07 compute-0 nova_compute[187243]: 2025-12-03 00:17:07.065 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.067s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:07 compute-0 nova_compute[187243]: 2025-12-03 00:17:07.091 187247 INFO nova.scheduler.client.report [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Deleted allocations for instance 72f66e11-43aa-4598-95a3-697bee26b5e1
Dec 03 00:17:07 compute-0 nova_compute[187243]: 2025-12-03 00:17:07.322 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:08 compute-0 nova_compute[187243]: 2025-12-03 00:17:08.136 187247 DEBUG oslo_concurrency.lockutils [None req-420356d8-9443-4135-840f-fbc9e1c24585 0473307cd38b412cbfdbd093053eb1af e510a0888b4c4fb5860a0f1720b8ed4b - - default default] Lock "72f66e11-43aa-4598-95a3-697bee26b5e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.188s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:08 compute-0 nova_compute[187243]: 2025-12-03 00:17:08.970 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:09 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:09.970 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:09 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:09.971 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:17:09 compute-0 nova_compute[187243]: 2025-12-03 00:17:09.972 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:12 compute-0 nova_compute[187243]: 2025-12-03 00:17:12.324 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:13 compute-0 nova_compute[187243]: 2025-12-03 00:17:13.971 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:15 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:15.973 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:17:17 compute-0 nova_compute[187243]: 2025-12-03 00:17:17.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:18 compute-0 nova_compute[187243]: 2025-12-03 00:17:18.972 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:19 compute-0 nova_compute[187243]: 2025-12-03 00:17:19.277 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:20 compute-0 podman[219475]: 2025-12-03 00:17:20.098959345 +0000 UTC m=+0.062099904 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:17:21 compute-0 podman[219497]: 2025-12-03 00:17:21.124700303 +0000 UTC m=+0.077890377 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:17:22 compute-0 nova_compute[187243]: 2025-12-03 00:17:22.375 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:23 compute-0 nova_compute[187243]: 2025-12-03 00:17:23.974 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:27 compute-0 nova_compute[187243]: 2025-12-03 00:17:27.419 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:28 compute-0 nova_compute[187243]: 2025-12-03 00:17:28.976 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:29 compute-0 podman[197600]: time="2025-12-03T00:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:17:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:17:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2609 "" "Go-http-client/1.1"
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: ERROR   00:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: ERROR   00:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: ERROR   00:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: ERROR   00:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: ERROR   00:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:17:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:17:31 compute-0 sshd-session[219517]: Invalid user syncthing from 102.210.148.92 port 54214
Dec 03 00:17:31 compute-0 podman[219519]: 2025-12-03 00:17:31.846384725 +0000 UTC m=+0.054467779 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:17:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:31.958 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:70:96:25 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3da51bfd7f1c491b839f6b6b49056c8b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636cd919-869d-4a8a-92fa-ec7c18804da5) old=Port_Binding(mac=['fa:16:3e:70:96:25'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3da51bfd7f1c491b839f6b6b49056c8b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:31.959 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636cd919-869d-4a8a-92fa-ec7c18804da5 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 updated
Dec 03 00:17:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:31.960 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:17:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:31.961 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[88579fe2-9d3e-4d16-a891-c336cb55b7ed]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:32 compute-0 sshd-session[219517]: Received disconnect from 102.210.148.92 port 54214:11: Bye Bye [preauth]
Dec 03 00:17:32 compute-0 sshd-session[219517]: Disconnected from invalid user syncthing 102.210.148.92 port 54214 [preauth]
Dec 03 00:17:32 compute-0 nova_compute[187243]: 2025-12-03 00:17:32.421 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:33 compute-0 nova_compute[187243]: 2025-12-03 00:17:33.979 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:36 compute-0 podman[219543]: 2025-12-03 00:17:36.111758426 +0000 UTC m=+0.062484204 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 03 00:17:36 compute-0 podman[219544]: 2025-12-03 00:17:36.128058452 +0000 UTC m=+0.082159856 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 03 00:17:37 compute-0 nova_compute[187243]: 2025-12-03 00:17:37.463 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:38 compute-0 nova_compute[187243]: 2025-12-03 00:17:38.980 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:41.222 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:6b:b4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6792b8fb-c596-438f-8f0f-cceaba427dae, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=45d438f6-104b-45ed-8931-ccdd86402201) old=Port_Binding(mac=['fa:16:3e:a0:6b:b4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-296b09f4-618a-4795-9eb9-f83709052164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:17:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:41.224 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 45d438f6-104b-45ed-8931-ccdd86402201 in datapath 296b09f4-618a-4795-9eb9-f83709052164 updated
Dec 03 00:17:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:41.225 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 296b09f4-618a-4795-9eb9-f83709052164, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:17:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:17:41.226 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[031ba6a0-f8c7-4528-897a-033f03f73d83]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:17:42 compute-0 nova_compute[187243]: 2025-12-03 00:17:42.466 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:43 compute-0 sshd-session[219589]: Received disconnect from 45.78.219.213 port 35514:11: Bye Bye [preauth]
Dec 03 00:17:43 compute-0 sshd-session[219589]: Disconnected from authenticating user root 45.78.219.213 port 35514 [preauth]
Dec 03 00:17:43 compute-0 nova_compute[187243]: 2025-12-03 00:17:43.981 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:45 compute-0 nova_compute[187243]: 2025-12-03 00:17:45.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:46 compute-0 sshd-session[219592]: Invalid user khan from 20.123.120.169 port 58962
Dec 03 00:17:46 compute-0 sshd-session[219592]: Received disconnect from 20.123.120.169 port 58962:11: Bye Bye [preauth]
Dec 03 00:17:46 compute-0 sshd-session[219592]: Disconnected from invalid user khan 20.123.120.169 port 58962 [preauth]
Dec 03 00:17:46 compute-0 nova_compute[187243]: 2025-12-03 00:17:46.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:46 compute-0 nova_compute[187243]: 2025-12-03 00:17:46.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:47 compute-0 nova_compute[187243]: 2025-12-03 00:17:47.496 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:48 compute-0 nova_compute[187243]: 2025-12-03 00:17:48.982 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:50 compute-0 sshd-session[219594]: Received disconnect from 23.95.37.90 port 59502:11: Bye Bye [preauth]
Dec 03 00:17:50 compute-0 sshd-session[219594]: Disconnected from authenticating user root 23.95.37.90 port 59502 [preauth]
Dec 03 00:17:50 compute-0 nova_compute[187243]: 2025-12-03 00:17:50.392 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:50 compute-0 nova_compute[187243]: 2025-12-03 00:17:50.392 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:50 compute-0 nova_compute[187243]: 2025-12-03 00:17:50.896 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:17:51 compute-0 podman[219596]: 2025-12-03 00:17:51.097933534 +0000 UTC m=+0.055617709 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.101 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.441 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.441 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.447 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.447 187247 INFO nova.compute.claims [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:17:51 compute-0 nova_compute[187243]: 2025-12-03 00:17:51.617 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:52 compute-0 podman[219617]: 2025-12-03 00:17:52.095257597 +0000 UTC m=+0.053512485 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 00:17:52 compute-0 ovn_controller[95488]: 2025-12-03T00:17:52Z|00191|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:17:52 compute-0 nova_compute[187243]: 2025-12-03 00:17:52.498 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:52 compute-0 nova_compute[187243]: 2025-12-03 00:17:52.502 187247 DEBUG nova.compute.provider_tree [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.008 187247 DEBUG nova.scheduler.client.report [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.524 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.525 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.527 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.909s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.527 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.527 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.679 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.681 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.701 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.702 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.16227340698242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.702 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.702 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:53 compute-0 nova_compute[187243]: 2025-12-03 00:17:53.984 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.034 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.035 187247 DEBUG nova.network.neutron [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.035 187247 WARNING neutronclient.v2_0.client [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.036 187247 WARNING neutronclient.v2_0.client [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.542 187247 INFO nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.580 187247 DEBUG nova.network.neutron [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Successfully created port: c926feac-0f5a-4138-a74f-f066c3bf5f80 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.738 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance c9a442a2-b67f-45a9-a7b3-2f866d137327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.738 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.739 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:17:53 up  1:26,  0 user,  load average: 0.16, 0.27, 0.28\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:17:54 compute-0 nova_compute[187243]: 2025-12-03 00:17:54.786 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.055 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.296 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.805 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.806 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.807 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.807 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.916 187247 DEBUG nova.network.neutron [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Successfully updated port: c926feac-0f5a-4138-a74f-f066c3bf5f80 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.992 187247 DEBUG nova.compute.manager [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-changed-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.992 187247 DEBUG nova.compute.manager [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Refreshing instance network info cache due to event network-changed-c926feac-0f5a-4138-a74f-f066c3bf5f80. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.993 187247 DEBUG oslo_concurrency.lockutils [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.993 187247 DEBUG oslo_concurrency.lockutils [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:17:55 compute-0 nova_compute[187243]: 2025-12-03 00:17:55.994 187247 DEBUG nova.network.neutron [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Refreshing network info cache for port c926feac-0f5a-4138-a74f-f066c3bf5f80 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.071 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.072 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.073 187247 INFO nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Creating image(s)
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.073 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.074 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.074 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.075 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.079 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.080 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.135 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.136 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.136 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.137 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.139 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.140 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.189 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.190 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.223 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.224 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.225 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.271 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.272 187247 DEBUG nova.virt.disk.api [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Checking if we can resize image /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.273 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.344 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.344 187247 DEBUG nova.virt.disk.api [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Cannot resize image /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.345 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.345 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Ensure instance console log exists: /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.345 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.346 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.346 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.421 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.499 187247 WARNING neutronclient.v2_0.client [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:56 compute-0 nova_compute[187243]: 2025-12-03 00:17:56.899 187247 DEBUG nova.network.neutron [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:17:57 compute-0 nova_compute[187243]: 2025-12-03 00:17:57.105 187247 DEBUG nova.network.neutron [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:17:57 compute-0 nova_compute[187243]: 2025-12-03 00:17:57.500 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:57 compute-0 nova_compute[187243]: 2025-12-03 00:17:57.612 187247 DEBUG oslo_concurrency.lockutils [req-cbad44a5-5e13-4712-808a-027ecdc4cfcc req-ba91b998-3879-42e8-a8b4-c8f08bc11467 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:17:57 compute-0 nova_compute[187243]: 2025-12-03 00:17:57.613 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquired lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:17:57 compute-0 nova_compute[187243]: 2025-12-03 00:17:57.613 187247 DEBUG nova.network.neutron [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.101 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.102 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.102 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.103 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:17:58 compute-0 sshd-session[219652]: Invalid user user8 from 49.247.36.49 port 34107
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.610 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:17:58 compute-0 sshd-session[219652]: Received disconnect from 49.247.36.49 port 34107:11: Bye Bye [preauth]
Dec 03 00:17:58 compute-0 sshd-session[219652]: Disconnected from invalid user user8 49.247.36.49 port 34107 [preauth]
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.816 187247 DEBUG nova.network.neutron [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:17:58 compute-0 nova_compute[187243]: 2025-12-03 00:17:58.984 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:17:59 compute-0 nova_compute[187243]: 2025-12-03 00:17:59.077 187247 WARNING neutronclient.v2_0.client [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:17:59 compute-0 podman[197600]: time="2025-12-03T00:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:17:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:17:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec 03 00:17:59 compute-0 nova_compute[187243]: 2025-12-03 00:17:59.888 187247 DEBUG nova.network.neutron [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.396 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Releasing lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.396 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance network_info: |[{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.400 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Start _get_guest_xml network_info=[{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.404 187247 WARNING nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.406 187247 DEBUG nova.virt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009', uuid='c9a442a2-b67f-45a9-a7b3-2f866d137327'), owner=OwnerMeta(userid='db24d5b25a924602ae8a7dc539bc6cbf', username='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin', projectid='e363b47741a1476ca7e5987b6d15acb5', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", 
"ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721080.4066808) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.412 187247 DEBUG nova.virt.libvirt.host [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.412 187247 DEBUG nova.virt.libvirt.host [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.418 187247 DEBUG nova.virt.libvirt.host [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.418 187247 DEBUG nova.virt.libvirt.host [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.419 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.420 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.420 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.420 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.420 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.421 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.421 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.421 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.421 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.421 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.422 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.422 187247 DEBUG nova.virt.hardware [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.425 187247 DEBUG nova.virt.libvirt.vif [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:17:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1409069',id=26,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-82faa1lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:17:55Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.425 187247 DEBUG nova.network.os_vif_util [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.426 187247 DEBUG nova.network.os_vif_util [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.426 187247 DEBUG nova.objects.instance [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9a442a2-b67f-45a9-a7b3-2f866d137327 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:18:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:00.716 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:00.716 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:00.716 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.934 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <uuid>c9a442a2-b67f-45a9-a7b3-2f866d137327</uuid>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <name>instance-0000001a</name>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009</nova:name>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:00</nova:creationTime>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:18:00 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:18:00 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         <nova:port uuid="c926feac-0f5a-4138-a74f-f066c3bf5f80">
Dec 03 00:18:00 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <system>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <entry name="serial">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <entry name="uuid">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </system>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <os>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </os>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <features>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </features>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:91:4e:2a"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <target dev="tapc926feac-0f"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <video>
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </video>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:18:00 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:18:00 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:18:00 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:18:00 compute-0 nova_compute[187243]: </domain>
Dec 03 00:18:00 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.936 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Preparing to wait for external event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.936 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.937 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.937 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.938 187247 DEBUG nova.virt.libvirt.vif [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:17:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1409069',id=26,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-82faa1lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:17:55Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.938 187247 DEBUG nova.network.os_vif_util [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.939 187247 DEBUG nova.network.os_vif_util [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.939 187247 DEBUG os_vif [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.940 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.940 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.941 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.942 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.942 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0600d7a0-8dfb-590e-af55-2c595eba2741', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.980 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.981 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.984 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.984 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc926feac-0f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.985 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc926feac-0f, col_values=(('qos', UUID('1805a310-30d9-4e18-9fd8-c6d18514748d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.985 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc926feac-0f, col_values=(('external_ids', {'iface-id': 'c926feac-0f5a-4138-a74f-f066c3bf5f80', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:4e:2a', 'vm-uuid': 'c9a442a2-b67f-45a9-a7b3-2f866d137327'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.986 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 NetworkManager[55671]: <info>  [1764721080.9878] manager: (tapc926feac-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.988 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.991 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:00 compute-0 nova_compute[187243]: 2025-12-03 00:18:00.992 187247 INFO os_vif [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f')
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: ERROR   00:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: ERROR   00:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: ERROR   00:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: ERROR   00:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: ERROR   00:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:18:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:18:01 compute-0 anacron[7485]: Job `cron.monthly' started
Dec 03 00:18:01 compute-0 anacron[7485]: Job `cron.monthly' terminated
Dec 03 00:18:01 compute-0 anacron[7485]: Normal exit (3 jobs run)
Dec 03 00:18:02 compute-0 podman[219659]: 2025-12-03 00:18:02.105513434 +0000 UTC m=+0.058229705 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:18:02 compute-0 nova_compute[187243]: 2025-12-03 00:18:02.535 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:18:02 compute-0 nova_compute[187243]: 2025-12-03 00:18:02.535 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:18:02 compute-0 nova_compute[187243]: 2025-12-03 00:18:02.536 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No VIF found with MAC fa:16:3e:91:4e:2a, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:18:02 compute-0 nova_compute[187243]: 2025-12-03 00:18:02.536 187247 INFO nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Using config drive
Dec 03 00:18:03 compute-0 nova_compute[187243]: 2025-12-03 00:18:03.047 187247 WARNING neutronclient.v2_0.client [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:18:03 compute-0 nova_compute[187243]: 2025-12-03 00:18:03.988 187247 INFO nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Creating config drive at /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config
Dec 03 00:18:03 compute-0 nova_compute[187243]: 2025-12-03 00:18:03.997 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmprbvn10hl execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.009 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.129 187247 DEBUG oslo_concurrency.processutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmprbvn10hl" returned: 0 in 0.132s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:04 compute-0 kernel: tapc926feac-0f: entered promiscuous mode
Dec 03 00:18:04 compute-0 NetworkManager[55671]: <info>  [1764721084.1994] manager: (tapc926feac-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.199 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 ovn_controller[95488]: 2025-12-03T00:18:04Z|00192|binding|INFO|Claiming lport c926feac-0f5a-4138-a74f-f066c3bf5f80 for this chassis.
Dec 03 00:18:04 compute-0 ovn_controller[95488]: 2025-12-03T00:18:04Z|00193|binding|INFO|c926feac-0f5a-4138-a74f-f066c3bf5f80: Claiming fa:16:3e:91:4e:2a 10.100.0.9
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.202 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.214 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:4e:2a 10.100.0.9'], port_security=['fa:16:3e:91:4e:2a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c9a442a2-b67f-45a9-a7b3-2f866d137327', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=c926feac-0f5a-4138-a74f-f066c3bf5f80) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.214 104379 INFO neutron.agent.ovn.metadata.agent [-] Port c926feac-0f5a-4138-a74f-f066c3bf5f80 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 bound to our chassis
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.216 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.228 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3241a6f4-021a-4a68-8388-536c117372ee]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.229 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7ff943d-e1 in ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:18:04 compute-0 systemd-udevd[219702]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.231 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7ff943d-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.231 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6226bc-89d8-40e6-abbf-21420fa5469e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.232 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[977eafab-6659-403b-9aba-52cc5eb02ff0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 systemd-machined[153518]: New machine qemu-17-instance-0000001a.
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.246 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[d612ac08-36f1-47e0-bad3-86e7ee66085f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 NetworkManager[55671]: <info>  [1764721084.2491] device (tapc926feac-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:18:04 compute-0 NetworkManager[55671]: <info>  [1764721084.2518] device (tapc926feac-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.254 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 ovn_controller[95488]: 2025-12-03T00:18:04Z|00194|binding|INFO|Setting lport c926feac-0f5a-4138-a74f-f066c3bf5f80 ovn-installed in OVS
Dec 03 00:18:04 compute-0 ovn_controller[95488]: 2025-12-03T00:18:04Z|00195|binding|INFO|Setting lport c926feac-0f5a-4138-a74f-f066c3bf5f80 up in Southbound
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.260 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.263 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eeef5c84-9f31-4370-a26e-3229eafca048]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-0000001a.
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.290 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[aa651d4e-37ed-4ad6-9516-552be32a55c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.295 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[079e948e-3552-4799-99f3-1040d8c8f7d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 systemd-udevd[219708]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:18:04 compute-0 NetworkManager[55671]: <info>  [1764721084.2963] manager: (tapf7ff943d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.331 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef08401-64cc-4684-8488-4e5bec630a2d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.334 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[1109daa7-ce14-4df0-abb4-589d010c561b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 NetworkManager[55671]: <info>  [1764721084.3592] device (tapf7ff943d-e0): carrier: link connected
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.365 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb66372-ffad-4aca-84c8-33116bd7c742]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.386 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[989a3914-de0e-4071-93c6-4e7035f95d14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517500, 'reachable_time': 28846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219736, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.404 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ea49b1cc-10c7-4cdf-a418-d748bf427762]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:9625'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517500, 'tstamp': 517500}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219737, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.420 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ac15ea00-39d9-4ac8-a8b8-cba57f093469]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517500, 'reachable_time': 28846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219738, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.449 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad93194-18c2-4be5-a45b-c2aa2cadaa57]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.525 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6011e047-1955-43b9-9c54-4d5945e69c60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.527 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.527 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.527 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:04 compute-0 NetworkManager[55671]: <info>  [1764721084.5299] manager: (tapf7ff943d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec 03 00:18:04 compute-0 kernel: tapf7ff943d-e0: entered promiscuous mode
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.529 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.531 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:04 compute-0 ovn_controller[95488]: 2025-12-03T00:18:04Z|00196|binding|INFO|Releasing lport 636cd919-869d-4a8a-92fa-ec7c18804da5 from this chassis (sb_readonly=0)
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.532 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.533 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.534 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5978be85-61a9-4035-a1cf-dd888114652c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.535 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.535 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.535 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.536 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.536 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[987faef5-4245-47db-8b93-73adef3393e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.536 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.537 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8322d5be-6526-42a9-a5d4-4dc55a60c2c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.537 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:18:04 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:04.538 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'env', 'PROCESS_TAG=haproxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.547 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.680 187247 DEBUG nova.compute.manager [req-9fe6db11-102d-4a44-88f1-26f1ada5b9ca req-6417e417-a9fa-4c80-892d-28cbd1966fa1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.681 187247 DEBUG oslo_concurrency.lockutils [req-9fe6db11-102d-4a44-88f1-26f1ada5b9ca req-6417e417-a9fa-4c80-892d-28cbd1966fa1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.681 187247 DEBUG oslo_concurrency.lockutils [req-9fe6db11-102d-4a44-88f1-26f1ada5b9ca req-6417e417-a9fa-4c80-892d-28cbd1966fa1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.681 187247 DEBUG oslo_concurrency.lockutils [req-9fe6db11-102d-4a44-88f1-26f1ada5b9ca req-6417e417-a9fa-4c80-892d-28cbd1966fa1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.681 187247 DEBUG nova.compute.manager [req-9fe6db11-102d-4a44-88f1-26f1ada5b9ca req-6417e417-a9fa-4c80-892d-28cbd1966fa1 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Processing event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.951 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.955 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.958 187247 INFO nova.virt.libvirt.driver [-] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance spawned successfully.
Dec 03 00:18:04 compute-0 nova_compute[187243]: 2025-12-03 00:18:04.959 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:18:04 compute-0 podman[219770]: 2025-12-03 00:18:04.871911574 +0000 UTC m=+0.023802877 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:18:04 compute-0 podman[219770]: 2025-12-03 00:18:04.966919656 +0000 UTC m=+0.118810959 container create 9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Dec 03 00:18:05 compute-0 systemd[1]: Started libpod-conmon-9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2.scope.
Dec 03 00:18:05 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:18:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cef287c549175ae16bcd4efac9406ba74739717802e4a4fd7bbbf1ee200c6b92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:18:05 compute-0 podman[219770]: 2025-12-03 00:18:05.066150076 +0000 UTC m=+0.218041389 container init 9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Dec 03 00:18:05 compute-0 podman[219770]: 2025-12-03 00:18:05.071126473 +0000 UTC m=+0.223017756 container start 9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:18:05 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [NOTICE]   (219796) : New worker (219798) forked
Dec 03 00:18:05 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [NOTICE]   (219796) : Loading success.
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.472 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.473 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.473 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.474 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.474 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.474 187247 DEBUG nova.virt.libvirt.driver [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:05 compute-0 sshd-session[219683]: Invalid user bodega from 101.47.140.127 port 55498
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.989 187247 INFO nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Took 9.92 seconds to spawn the instance on the hypervisor.
Dec 03 00:18:05 compute-0 nova_compute[187243]: 2025-12-03 00:18:05.990 187247 DEBUG nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.021 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:06 compute-0 sshd-session[219683]: Received disconnect from 101.47.140.127 port 55498:11: Bye Bye [preauth]
Dec 03 00:18:06 compute-0 sshd-session[219683]: Disconnected from invalid user bodega 101.47.140.127 port 55498 [preauth]
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.552 187247 INFO nova.compute.manager [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Took 15.15 seconds to build instance.
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.782 187247 DEBUG nova.compute.manager [req-be0bc1e7-c5b3-4976-99a0-c9ed8c6ef7f0 req-238ff075-5849-41e1-98f7-a0afb2c42322 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.782 187247 DEBUG oslo_concurrency.lockutils [req-be0bc1e7-c5b3-4976-99a0-c9ed8c6ef7f0 req-238ff075-5849-41e1-98f7-a0afb2c42322 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.783 187247 DEBUG oslo_concurrency.lockutils [req-be0bc1e7-c5b3-4976-99a0-c9ed8c6ef7f0 req-238ff075-5849-41e1-98f7-a0afb2c42322 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.783 187247 DEBUG oslo_concurrency.lockutils [req-be0bc1e7-c5b3-4976-99a0-c9ed8c6ef7f0 req-238ff075-5849-41e1-98f7-a0afb2c42322 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.783 187247 DEBUG nova.compute.manager [req-be0bc1e7-c5b3-4976-99a0-c9ed8c6ef7f0 req-238ff075-5849-41e1-98f7-a0afb2c42322 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:18:06 compute-0 nova_compute[187243]: 2025-12-03 00:18:06.783 187247 WARNING nova.compute.manager [req-be0bc1e7-c5b3-4976-99a0-c9ed8c6ef7f0 req-238ff075-5849-41e1-98f7-a0afb2c42322 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received unexpected event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with vm_state active and task_state None.
Dec 03 00:18:07 compute-0 nova_compute[187243]: 2025-12-03 00:18:07.062 187247 DEBUG oslo_concurrency.lockutils [None req-c8a01f7d-df5b-4e70-a999-7bca9b9812d2 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.670s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:07 compute-0 podman[219807]: 2025-12-03 00:18:07.157352604 +0000 UTC m=+0.108363213 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:18:07 compute-0 podman[219808]: 2025-12-03 00:18:07.171632398 +0000 UTC m=+0.121719104 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 03 00:18:08 compute-0 nova_compute[187243]: 2025-12-03 00:18:08.989 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:11 compute-0 nova_compute[187243]: 2025-12-03 00:18:11.024 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:13 compute-0 nova_compute[187243]: 2025-12-03 00:18:13.991 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:14 compute-0 nova_compute[187243]: 2025-12-03 00:18:14.001 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:14 compute-0 nova_compute[187243]: 2025-12-03 00:18:14.002 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:14 compute-0 nova_compute[187243]: 2025-12-03 00:18:14.507 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:18:15 compute-0 nova_compute[187243]: 2025-12-03 00:18:15.069 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:15 compute-0 nova_compute[187243]: 2025-12-03 00:18:15.070 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:15 compute-0 nova_compute[187243]: 2025-12-03 00:18:15.080 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:18:15 compute-0 nova_compute[187243]: 2025-12-03 00:18:15.081 187247 INFO nova.compute.claims [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:18:16 compute-0 nova_compute[187243]: 2025-12-03 00:18:16.026 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:16 compute-0 nova_compute[187243]: 2025-12-03 00:18:16.158 187247 DEBUG nova.compute.provider_tree [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:18:16 compute-0 ovn_controller[95488]: 2025-12-03T00:18:16Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:91:4e:2a 10.100.0.9
Dec 03 00:18:16 compute-0 ovn_controller[95488]: 2025-12-03T00:18:16Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:91:4e:2a 10.100.0.9
Dec 03 00:18:17 compute-0 nova_compute[187243]: 2025-12-03 00:18:17.862 187247 DEBUG nova.scheduler.client.report [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:18:18 compute-0 nova_compute[187243]: 2025-12-03 00:18:18.457 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.388s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:18 compute-0 nova_compute[187243]: 2025-12-03 00:18:18.458 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:18:18 compute-0 nova_compute[187243]: 2025-12-03 00:18:18.993 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:19 compute-0 nova_compute[187243]: 2025-12-03 00:18:19.059 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:18:19 compute-0 nova_compute[187243]: 2025-12-03 00:18:19.059 187247 DEBUG nova.network.neutron [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:18:19 compute-0 nova_compute[187243]: 2025-12-03 00:18:19.060 187247 WARNING neutronclient.v2_0.client [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:18:19 compute-0 nova_compute[187243]: 2025-12-03 00:18:19.060 187247 WARNING neutronclient.v2_0.client [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:18:19 compute-0 nova_compute[187243]: 2025-12-03 00:18:19.567 187247 INFO nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:18:20 compute-0 nova_compute[187243]: 2025-12-03 00:18:20.115 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.057 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.366 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.368 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.368 187247 INFO nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Creating image(s)
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.369 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.369 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.369 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.370 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.373 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.374 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.427 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.427 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.428 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.428 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.432 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.432 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.482 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.484 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.525 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.527 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.528 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.583 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.584 187247 DEBUG nova.virt.disk.api [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Checking if we can resize image /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.585 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.634 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.635 187247 DEBUG nova.virt.disk.api [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Cannot resize image /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.636 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.636 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Ensure instance console log exists: /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.636 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.637 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:21 compute-0 nova_compute[187243]: 2025-12-03 00:18:21.637 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:22 compute-0 nova_compute[187243]: 2025-12-03 00:18:22.020 187247 DEBUG nova.network.neutron [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Successfully created port: 8b0adcad-4e57-4150-b6d7-890ceb893e2e _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:18:22 compute-0 podman[219877]: 2025-12-03 00:18:22.112621035 +0000 UTC m=+0.062683689 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 03 00:18:22 compute-0 podman[219898]: 2025-12-03 00:18:22.197346965 +0000 UTC m=+0.054520871 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:18:22 compute-0 nova_compute[187243]: 2025-12-03 00:18:22.945 187247 DEBUG nova.network.neutron [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Successfully updated port: 8b0adcad-4e57-4150-b6d7-890ceb893e2e _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.229 187247 DEBUG nova.compute.manager [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-changed-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.230 187247 DEBUG nova.compute.manager [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Refreshing instance network info cache due to event network-changed-8b0adcad-4e57-4150-b6d7-890ceb893e2e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.230 187247 DEBUG oslo_concurrency.lockutils [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.230 187247 DEBUG oslo_concurrency.lockutils [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.230 187247 DEBUG nova.network.neutron [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Refreshing network info cache for port 8b0adcad-4e57-4150-b6d7-890ceb893e2e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.450 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.738 187247 WARNING neutronclient.v2_0.client [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.896 187247 DEBUG nova.network.neutron [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:18:23 compute-0 nova_compute[187243]: 2025-12-03 00:18:23.994 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:24 compute-0 nova_compute[187243]: 2025-12-03 00:18:24.058 187247 DEBUG nova.network.neutron [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:18:24 compute-0 nova_compute[187243]: 2025-12-03 00:18:24.564 187247 DEBUG oslo_concurrency.lockutils [req-31aadcdb-9f69-4a2e-882c-b2fecff0d565 req-d750317e-c325-459c-aacc-9ad143ae0686 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:18:24 compute-0 nova_compute[187243]: 2025-12-03 00:18:24.564 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquired lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:18:24 compute-0 nova_compute[187243]: 2025-12-03 00:18:24.564 187247 DEBUG nova.network.neutron [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:18:25 compute-0 sshd-session[219919]: Received disconnect from 61.220.235.10 port 55656:11: Bye Bye [preauth]
Dec 03 00:18:25 compute-0 sshd-session[219919]: Disconnected from authenticating user root 61.220.235.10 port 55656 [preauth]
Dec 03 00:18:25 compute-0 nova_compute[187243]: 2025-12-03 00:18:25.888 187247 DEBUG nova.network.neutron [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:18:26 compute-0 nova_compute[187243]: 2025-12-03 00:18:26.098 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:26 compute-0 nova_compute[187243]: 2025-12-03 00:18:26.875 187247 WARNING neutronclient.v2_0.client [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.068 187247 DEBUG nova.network.neutron [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.575 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Releasing lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.575 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance network_info: |[{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.578 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Start _get_guest_xml network_info=[{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.582 187247 WARNING nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.583 187247 DEBUG nova.virt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744', uuid='2e3ecd0e-4de1-44c9-805b-8d695da6b95e'), owner=OwnerMeta(userid='db24d5b25a924602ae8a7dc539bc6cbf', username='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin', projectid='e363b47741a1476ca7e5987b6d15acb5', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", 
"ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721107.583526) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.590 187247 DEBUG nova.virt.libvirt.host [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.590 187247 DEBUG nova.virt.libvirt.host [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.594 187247 DEBUG nova.virt.libvirt.host [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.594 187247 DEBUG nova.virt.libvirt.host [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.595 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.596 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.596 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.596 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.596 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.596 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.596 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.597 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.597 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.597 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.597 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.597 187247 DEBUG nova.virt.hardware [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.601 187247 DEBUG nova.virt.libvirt.vif [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:18:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1045897',id=27,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-78r0m2qy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_n
ame='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:18:20Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.601 187247 DEBUG nova.network.os_vif_util [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.601 187247 DEBUG nova.network.os_vif_util [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:18:27 compute-0 nova_compute[187243]: 2025-12-03 00:18:27.602 187247 DEBUG nova.objects.instance [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e3ecd0e-4de1-44c9-805b-8d695da6b95e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.110 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <uuid>2e3ecd0e-4de1-44c9-805b-8d695da6b95e</uuid>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <name>instance-0000001b</name>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744</nova:name>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:27</nova:creationTime>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:18:28 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:18:28 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         <nova:port uuid="8b0adcad-4e57-4150-b6d7-890ceb893e2e">
Dec 03 00:18:28 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <system>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <entry name="serial">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <entry name="uuid">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </system>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <os>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </os>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <features>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </features>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9d:ff:2c"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <target dev="tap8b0adcad-4e"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <video>
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </video>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:18:28 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:18:28 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:18:28 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:18:28 compute-0 nova_compute[187243]: </domain>
Dec 03 00:18:28 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.111 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Preparing to wait for external event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.111 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.111 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.111 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.112 187247 DEBUG nova.virt.libvirt.vif [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:18:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1045897',id=27,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-78r0m2qy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',ow
ner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:18:20Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.113 187247 DEBUG nova.network.os_vif_util [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.113 187247 DEBUG nova.network.os_vif_util [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.114 187247 DEBUG os_vif [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.114 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.115 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.115 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.116 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.116 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9f5fee56-14ae-5e8c-9c1b-69c88a6bd4cb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.117 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.118 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.120 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.121 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b0adcad-4e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.121 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8b0adcad-4e, col_values=(('qos', UUID('87f5ac60-fb7e-4bce-883d-d28d23c59851')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.121 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8b0adcad-4e, col_values=(('external_ids', {'iface-id': '8b0adcad-4e57-4150-b6d7-890ceb893e2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:ff:2c', 'vm-uuid': '2e3ecd0e-4de1-44c9-805b-8d695da6b95e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.122 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 NetworkManager[55671]: <info>  [1764721108.1235] manager: (tap8b0adcad-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.124 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.128 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.129 187247 INFO os_vif [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e')
Dec 03 00:18:28 compute-0 nova_compute[187243]: 2025-12-03 00:18:28.997 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:29 compute-0 nova_compute[187243]: 2025-12-03 00:18:29.675 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:18:29 compute-0 nova_compute[187243]: 2025-12-03 00:18:29.676 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:18:29 compute-0 nova_compute[187243]: 2025-12-03 00:18:29.677 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No VIF found with MAC fa:16:3e:9d:ff:2c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:18:29 compute-0 nova_compute[187243]: 2025-12-03 00:18:29.677 187247 INFO nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Using config drive
Dec 03 00:18:29 compute-0 podman[197600]: time="2025-12-03T00:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:18:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:18:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3068 "" "Go-http-client/1.1"
Dec 03 00:18:30 compute-0 nova_compute[187243]: 2025-12-03 00:18:30.191 187247 WARNING neutronclient.v2_0.client [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:18:30 compute-0 nova_compute[187243]: 2025-12-03 00:18:30.987 187247 INFO nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Creating config drive at /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config
Dec 03 00:18:30 compute-0 nova_compute[187243]: 2025-12-03 00:18:30.992 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpavzbtrlr execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:31 compute-0 nova_compute[187243]: 2025-12-03 00:18:31.115 187247 DEBUG oslo_concurrency.processutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpavzbtrlr" returned: 0 in 0.123s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:31 compute-0 kernel: tap8b0adcad-4e: entered promiscuous mode
Dec 03 00:18:31 compute-0 NetworkManager[55671]: <info>  [1764721111.1622] manager: (tap8b0adcad-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Dec 03 00:18:31 compute-0 nova_compute[187243]: 2025-12-03 00:18:31.196 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:31 compute-0 ovn_controller[95488]: 2025-12-03T00:18:31Z|00197|binding|INFO|Claiming lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e for this chassis.
Dec 03 00:18:31 compute-0 ovn_controller[95488]: 2025-12-03T00:18:31Z|00198|binding|INFO|8b0adcad-4e57-4150-b6d7-890ceb893e2e: Claiming fa:16:3e:9d:ff:2c 10.100.0.10
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.208 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:ff:2c 10.100.0.10'], port_security=['fa:16:3e:9d:ff:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2e3ecd0e-4de1-44c9-805b-8d695da6b95e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=8b0adcad-4e57-4150-b6d7-890ceb893e2e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.209 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 8b0adcad-4e57-4150-b6d7-890ceb893e2e in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 bound to our chassis
Dec 03 00:18:31 compute-0 ovn_controller[95488]: 2025-12-03T00:18:31Z|00199|binding|INFO|Setting lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e ovn-installed in OVS
Dec 03 00:18:31 compute-0 ovn_controller[95488]: 2025-12-03T00:18:31Z|00200|binding|INFO|Setting lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e up in Southbound
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.211 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:18:31 compute-0 nova_compute[187243]: 2025-12-03 00:18:31.212 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:31 compute-0 systemd-udevd[219943]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:18:31 compute-0 systemd-machined[153518]: New machine qemu-18-instance-0000001b.
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.228 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[28abdfa4-ab7f-411f-9320-f065f0d34527]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 NetworkManager[55671]: <info>  [1764721111.2339] device (tap8b0adcad-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:18:31 compute-0 NetworkManager[55671]: <info>  [1764721111.2347] device (tap8b0adcad-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:18:31 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000001b.
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.255 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[b775b5a2-5871-4a09-98a5-44f8a6260852]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.257 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[5f796c54-e4f0-474e-b306-d89c1e2a300a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.286 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0db3db-cce6-4de0-8d90-2c6da7820d9e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.303 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9075e0a8-d590-4c31-83a5-9b982fcc8475]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517500, 'reachable_time': 28846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219955, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.318 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[010e6d43-57e6-425c-8226-a5c6a9683850]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517511, 'tstamp': 517511}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219957, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517516, 'tstamp': 517516}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219957, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.319 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:31 compute-0 nova_compute[187243]: 2025-12-03 00:18:31.321 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:31 compute-0 nova_compute[187243]: 2025-12-03 00:18:31.322 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.322 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.322 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.323 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.323 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:18:31 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:31.324 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[84ef1aee-7f90-4dc3-98e5-bcc46a2f6cb3]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: ERROR   00:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: ERROR   00:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: ERROR   00:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: ERROR   00:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: ERROR   00:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:18:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.001 187247 DEBUG nova.compute.manager [req-afbaccb1-1cec-43c3-b860-352ac586fa4b req-0d5f1185-41c6-4617-a50a-f59552538e28 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.002 187247 DEBUG oslo_concurrency.lockutils [req-afbaccb1-1cec-43c3-b860-352ac586fa4b req-0d5f1185-41c6-4617-a50a-f59552538e28 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.002 187247 DEBUG oslo_concurrency.lockutils [req-afbaccb1-1cec-43c3-b860-352ac586fa4b req-0d5f1185-41c6-4617-a50a-f59552538e28 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.002 187247 DEBUG oslo_concurrency.lockutils [req-afbaccb1-1cec-43c3-b860-352ac586fa4b req-0d5f1185-41c6-4617-a50a-f59552538e28 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.002 187247 DEBUG nova.compute.manager [req-afbaccb1-1cec-43c3-b860-352ac586fa4b req-0d5f1185-41c6-4617-a50a-f59552538e28 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Processing event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.003 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.006 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.009 187247 INFO nova.virt.libvirt.driver [-] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance spawned successfully.
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.009 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:18:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:32.059 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:18:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:32.059 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.059 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.523 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.524 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.524 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.524 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.525 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:32 compute-0 nova_compute[187243]: 2025-12-03 00:18:32.525 187247 DEBUG nova.virt.libvirt.driver [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:18:33 compute-0 nova_compute[187243]: 2025-12-03 00:18:33.034 187247 INFO nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Took 11.67 seconds to spawn the instance on the hypervisor.
Dec 03 00:18:33 compute-0 nova_compute[187243]: 2025-12-03 00:18:33.035 187247 DEBUG nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:18:33 compute-0 podman[219968]: 2025-12-03 00:18:33.090299783 +0000 UTC m=+0.048571449 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:18:33 compute-0 nova_compute[187243]: 2025-12-03 00:18:33.122 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:33 compute-0 sshd-session[219966]: Invalid user admin from 80.94.95.116 port 25984
Dec 03 00:18:33 compute-0 nova_compute[187243]: 2025-12-03 00:18:33.567 187247 INFO nova.compute.manager [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Took 18.55 seconds to build instance.
Dec 03 00:18:33 compute-0 sshd-session[219966]: Connection closed by invalid user admin 80.94.95.116 port 25984 [preauth]
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.000 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.069 187247 DEBUG nova.compute.manager [req-f42b7056-fe2b-43f0-944d-5767b2fa8845 req-934612f5-b66f-4e61-8f49-275ba3d3b66b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.070 187247 DEBUG oslo_concurrency.lockutils [req-f42b7056-fe2b-43f0-944d-5767b2fa8845 req-934612f5-b66f-4e61-8f49-275ba3d3b66b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.070 187247 DEBUG oslo_concurrency.lockutils [req-f42b7056-fe2b-43f0-944d-5767b2fa8845 req-934612f5-b66f-4e61-8f49-275ba3d3b66b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.070 187247 DEBUG oslo_concurrency.lockutils [req-f42b7056-fe2b-43f0-944d-5767b2fa8845 req-934612f5-b66f-4e61-8f49-275ba3d3b66b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.070 187247 DEBUG nova.compute.manager [req-f42b7056-fe2b-43f0-944d-5767b2fa8845 req-934612f5-b66f-4e61-8f49-275ba3d3b66b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.070 187247 WARNING nova.compute.manager [req-f42b7056-fe2b-43f0-944d-5767b2fa8845 req-934612f5-b66f-4e61-8f49-275ba3d3b66b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received unexpected event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with vm_state active and task_state None.
Dec 03 00:18:34 compute-0 nova_compute[187243]: 2025-12-03 00:18:34.072 187247 DEBUG oslo_concurrency.lockutils [None req-fea5ab14-4abc-4820-902d-70a671d32cf8 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:38 compute-0 podman[219992]: 2025-12-03 00:18:38.099367501 +0000 UTC m=+0.052672864 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:18:38 compute-0 nova_compute[187243]: 2025-12-03 00:18:38.123 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:38 compute-0 podman[219993]: 2025-12-03 00:18:38.126727128 +0000 UTC m=+0.077432974 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Dec 03 00:18:39 compute-0 nova_compute[187243]: 2025-12-03 00:18:39.002 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:42 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:18:42.061 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:18:43 compute-0 nova_compute[187243]: 2025-12-03 00:18:43.125 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:44 compute-0 nova_compute[187243]: 2025-12-03 00:18:44.004 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:45 compute-0 ovn_controller[95488]: 2025-12-03T00:18:45Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9d:ff:2c 10.100.0.10
Dec 03 00:18:45 compute-0 ovn_controller[95488]: 2025-12-03T00:18:45Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:ff:2c 10.100.0.10
Dec 03 00:18:46 compute-0 nova_compute[187243]: 2025-12-03 00:18:46.100 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:48 compute-0 nova_compute[187243]: 2025-12-03 00:18:48.128 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:48 compute-0 nova_compute[187243]: 2025-12-03 00:18:48.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:49 compute-0 nova_compute[187243]: 2025-12-03 00:18:49.007 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:49 compute-0 sshd-session[220055]: Invalid user sales1 from 102.210.148.92 port 34534
Dec 03 00:18:49 compute-0 sshd-session[220055]: Received disconnect from 102.210.148.92 port 34534:11: Bye Bye [preauth]
Dec 03 00:18:49 compute-0 sshd-session[220055]: Disconnected from invalid user sales1 102.210.148.92 port 34534 [preauth]
Dec 03 00:18:51 compute-0 nova_compute[187243]: 2025-12-03 00:18:51.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:51 compute-0 nova_compute[187243]: 2025-12-03 00:18:51.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:18:51 compute-0 nova_compute[187243]: 2025-12-03 00:18:51.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:52 compute-0 nova_compute[187243]: 2025-12-03 00:18:52.121 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:52 compute-0 nova_compute[187243]: 2025-12-03 00:18:52.122 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:52 compute-0 nova_compute[187243]: 2025-12-03 00:18:52.122 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:52 compute-0 nova_compute[187243]: 2025-12-03 00:18:52.122 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:18:52 compute-0 podman[220058]: 2025-12-03 00:18:52.226378678 +0000 UTC m=+0.058516702 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git)
Dec 03 00:18:52 compute-0 podman[220079]: 2025-12-03 00:18:52.315639294 +0000 UTC m=+0.060401301 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.130 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.169 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.223 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.223 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.275 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.281 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.333 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.334 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.385 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.513 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.514 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.533 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.534 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5510MB free_disk=73.10489654541016GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.534 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:18:53 compute-0 nova_compute[187243]: 2025-12-03 00:18:53.535 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:18:54 compute-0 nova_compute[187243]: 2025-12-03 00:18:54.009 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:54 compute-0 nova_compute[187243]: 2025-12-03 00:18:54.730 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance c9a442a2-b67f-45a9-a7b3-2f866d137327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:18:54 compute-0 nova_compute[187243]: 2025-12-03 00:18:54.730 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 2e3ecd0e-4de1-44c9-805b-8d695da6b95e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:18:54 compute-0 nova_compute[187243]: 2025-12-03 00:18:54.731 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:18:54 compute-0 nova_compute[187243]: 2025-12-03 00:18:54.731 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:18:53 up  1:27,  0 user,  load average: 0.42, 0.32, 0.30\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:18:54 compute-0 nova_compute[187243]: 2025-12-03 00:18:54.849 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:18:55 compute-0 nova_compute[187243]: 2025-12-03 00:18:55.472 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:18:56 compute-0 nova_compute[187243]: 2025-12-03 00:18:56.475 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:18:56 compute-0 nova_compute[187243]: 2025-12-03 00:18:56.475 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.941s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:18:57 compute-0 nova_compute[187243]: 2025-12-03 00:18:57.476 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:57 compute-0 nova_compute[187243]: 2025-12-03 00:18:57.476 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:57 compute-0 nova_compute[187243]: 2025-12-03 00:18:57.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:58 compute-0 nova_compute[187243]: 2025-12-03 00:18:58.132 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:58 compute-0 nova_compute[187243]: 2025-12-03 00:18:58.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:18:59 compute-0 nova_compute[187243]: 2025-12-03 00:18:59.011 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:18:59 compute-0 podman[197600]: time="2025-12-03T00:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:18:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:18:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3067 "" "Go-http-client/1.1"
Dec 03 00:19:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:00.717 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:00.718 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:00.718 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:01 compute-0 ovn_controller[95488]: 2025-12-03T00:19:01Z|00201|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: ERROR   00:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: ERROR   00:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: ERROR   00:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: ERROR   00:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: ERROR   00:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:19:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:19:03 compute-0 nova_compute[187243]: 2025-12-03 00:19:03.133 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:04 compute-0 nova_compute[187243]: 2025-12-03 00:19:04.014 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:04 compute-0 podman[220111]: 2025-12-03 00:19:04.132114732 +0000 UTC m=+0.090119219 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:19:04 compute-0 nova_compute[187243]: 2025-12-03 00:19:04.511 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Check if temp file /var/lib/nova/instances/tmpazvfqycv exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:19:04 compute-0 nova_compute[187243]: 2025-12-03 00:19:04.512 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Check if temp file /var/lib/nova/instances/tmpi5zwnlar exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:19:04 compute-0 nova_compute[187243]: 2025-12-03 00:19:04.517 187247 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e3ecd0e-4de1-44c9-805b-8d695da6b95e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:19:04 compute-0 nova_compute[187243]: 2025-12-03 00:19:04.519 187247 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c9a442a2-b67f-45a9-a7b3-2f866d137327',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:19:05 compute-0 nova_compute[187243]: 2025-12-03 00:19:05.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:08 compute-0 nova_compute[187243]: 2025-12-03 00:19:08.136 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.014 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:09 compute-0 podman[220136]: 2025-12-03 00:19:09.09645106 +0000 UTC m=+0.050608302 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 03 00:19:09 compute-0 podman[220137]: 2025-12-03 00:19:09.130345244 +0000 UTC m=+0.082476754 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.210 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.263 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.264 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.316 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.317 187247 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Preparing to wait for external event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.318 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.318 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:09 compute-0 nova_compute[187243]: 2025-12-03 00:19:09.318 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:13 compute-0 nova_compute[187243]: 2025-12-03 00:19:13.138 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:14 compute-0 nova_compute[187243]: 2025-12-03 00:19:14.016 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:16 compute-0 nova_compute[187243]: 2025-12-03 00:19:16.248 187247 DEBUG nova.compute.manager [req-e9fd3ada-fc55-4603-9578-4eefa1fe55c0 req-ec82f268-80cd-43b8-b9d0-dcc326d7d485 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:16 compute-0 nova_compute[187243]: 2025-12-03 00:19:16.248 187247 DEBUG oslo_concurrency.lockutils [req-e9fd3ada-fc55-4603-9578-4eefa1fe55c0 req-ec82f268-80cd-43b8-b9d0-dcc326d7d485 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:16 compute-0 nova_compute[187243]: 2025-12-03 00:19:16.249 187247 DEBUG oslo_concurrency.lockutils [req-e9fd3ada-fc55-4603-9578-4eefa1fe55c0 req-ec82f268-80cd-43b8-b9d0-dcc326d7d485 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:16 compute-0 nova_compute[187243]: 2025-12-03 00:19:16.249 187247 DEBUG oslo_concurrency.lockutils [req-e9fd3ada-fc55-4603-9578-4eefa1fe55c0 req-ec82f268-80cd-43b8-b9d0-dcc326d7d485 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:16 compute-0 nova_compute[187243]: 2025-12-03 00:19:16.249 187247 DEBUG nova.compute.manager [req-e9fd3ada-fc55-4603-9578-4eefa1fe55c0 req-ec82f268-80cd-43b8-b9d0-dcc326d7d485 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No event matching network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 in dict_keys([('network-vif-plugged', 'c926feac-0f5a-4138-a74f-f066c3bf5f80')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:19:16 compute-0 nova_compute[187243]: 2025-12-03 00:19:16.249 187247 DEBUG nova.compute.manager [req-e9fd3ada-fc55-4603-9578-4eefa1fe55c0 req-ec82f268-80cd-43b8-b9d0-dcc326d7d485 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.021 187247 INFO nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Took 8.70 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.139 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:18 compute-0 sshd-session[220186]: Invalid user admin1 from 23.95.37.90 port 57058
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.351 187247 DEBUG nova.compute.manager [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.352 187247 DEBUG oslo_concurrency.lockutils [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.352 187247 DEBUG oslo_concurrency.lockutils [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.352 187247 DEBUG oslo_concurrency.lockutils [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.353 187247 DEBUG nova.compute.manager [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Processing event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.353 187247 DEBUG nova.compute.manager [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-changed-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.353 187247 DEBUG nova.compute.manager [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Refreshing instance network info cache due to event network-changed-c926feac-0f5a-4138-a74f-f066c3bf5f80. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.353 187247 DEBUG oslo_concurrency.lockutils [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.353 187247 DEBUG oslo_concurrency.lockutils [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.354 187247 DEBUG nova.network.neutron [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Refreshing network info cache for port c926feac-0f5a-4138-a74f-f066c3bf5f80 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.354 187247 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:19:18 compute-0 sshd-session[220186]: Received disconnect from 23.95.37.90 port 57058:11: Bye Bye [preauth]
Dec 03 00:19:18 compute-0 sshd-session[220186]: Disconnected from invalid user admin1 23.95.37.90 port 57058 [preauth]
Dec 03 00:19:18 compute-0 sshd-session[220188]: Received disconnect from 20.123.120.169 port 60600:11: Bye Bye [preauth]
Dec 03 00:19:18 compute-0 sshd-session[220188]: Disconnected from authenticating user root 20.123.120.169 port 60600 [preauth]
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.860 187247 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi5zwnlar',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c9a442a2-b67f-45a9-a7b3-2f866d137327',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(7fa12d64-4350-4688-9062-94b18965be36),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:19:18 compute-0 nova_compute[187243]: 2025-12-03 00:19:18.861 187247 WARNING neutronclient.v2_0.client [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.017 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.380 187247 DEBUG nova.objects.instance [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid c9a442a2-b67f-45a9-a7b3-2f866d137327 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.382 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.384 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.384 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.886 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.886 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.892 187247 DEBUG nova.virt.libvirt.vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:17:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1409069',id=26,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-82faa1lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:18:06Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.893 187247 DEBUG nova.network.os_vif_util [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.893 187247 DEBUG nova.network.os_vif_util [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.894 187247 DEBUG nova.virt.libvirt.migration [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:91:4e:2a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <target dev="tapc926feac-0f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]: </interface>
Dec 03 00:19:19 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.894 187247 DEBUG nova.virt.libvirt.migration [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <name>instance-0000001a</name>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <uuid>c9a442a2-b67f-45a9-a7b3-2f866d137327</uuid>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009</nova:name>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:00</nova:creationTime>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:port uuid="c926feac-0f5a-4138-a74f-f066c3bf5f80">
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <system>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="serial">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="uuid">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </system>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <os>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </os>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <features>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </features>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:91:4e:2a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="tapc926feac-0f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </target>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </console>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </input>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <video>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </video>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]: </domain>
Dec 03 00:19:19 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.896 187247 DEBUG nova.virt.libvirt.migration [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <name>instance-0000001a</name>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <uuid>c9a442a2-b67f-45a9-a7b3-2f866d137327</uuid>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009</nova:name>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:00</nova:creationTime>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:port uuid="c926feac-0f5a-4138-a74f-f066c3bf5f80">
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <system>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="serial">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="uuid">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </system>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <os>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </os>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <features>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </features>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:91:4e:2a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="tapc926feac-0f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </target>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </console>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </input>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <video>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </video>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]: </domain>
Dec 03 00:19:19 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.896 187247 DEBUG nova.virt.libvirt.migration [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <name>instance-0000001a</name>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <uuid>c9a442a2-b67f-45a9-a7b3-2f866d137327</uuid>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009</nova:name>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:00</nova:creationTime>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <nova:port uuid="c926feac-0f5a-4138-a74f-f066c3bf5f80">
Dec 03 00:19:19 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <system>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="serial">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="uuid">c9a442a2-b67f-45a9-a7b3-2f866d137327</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </system>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <os>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </os>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <features>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </features>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/disk.config"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:91:4e:2a"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target dev="tapc926feac-0f"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:19:19 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       </target>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327/console.log" append="off"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </console>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </input>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <video>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </video>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:19:19 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:19:19 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:19:19 compute-0 nova_compute[187243]: </domain>
Dec 03 00:19:19 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:19:19 compute-0 nova_compute[187243]: 2025-12-03 00:19:19.897 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:19:20 compute-0 nova_compute[187243]: 2025-12-03 00:19:20.228 187247 WARNING neutronclient.v2_0.client [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:20 compute-0 nova_compute[187243]: 2025-12-03 00:19:20.389 187247 DEBUG nova.virt.libvirt.migration [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:19:20 compute-0 nova_compute[187243]: 2025-12-03 00:19:20.389 187247 INFO nova.virt.libvirt.migration [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:19:20 compute-0 nova_compute[187243]: 2025-12-03 00:19:20.409 187247 DEBUG nova.network.neutron [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updated VIF entry in instance network info cache for port c926feac-0f5a-4138-a74f-f066c3bf5f80. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:19:20 compute-0 nova_compute[187243]: 2025-12-03 00:19:20.409 187247 DEBUG nova.network.neutron [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Updating instance_info_cache with network_info: [{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:19:20 compute-0 nova_compute[187243]: 2025-12-03 00:19:20.916 187247 DEBUG oslo_concurrency.lockutils [req-f3af0cc5-a4f7-44b7-8d01-0022be42d6ab req-35caec3d-135f-419e-b670-226c0ebe5c33 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-c9a442a2-b67f-45a9-a7b3-2f866d137327" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.409 187247 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:19:21 compute-0 kernel: tapc926feac-0f (unregistering): left promiscuous mode
Dec 03 00:19:21 compute-0 NetworkManager[55671]: <info>  [1764721161.6929] device (tapc926feac-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:19:21 compute-0 ovn_controller[95488]: 2025-12-03T00:19:21Z|00202|binding|INFO|Releasing lport c926feac-0f5a-4138-a74f-f066c3bf5f80 from this chassis (sb_readonly=0)
Dec 03 00:19:21 compute-0 ovn_controller[95488]: 2025-12-03T00:19:21Z|00203|binding|INFO|Setting lport c926feac-0f5a-4138-a74f-f066c3bf5f80 down in Southbound
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.704 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 ovn_controller[95488]: 2025-12-03T00:19:21Z|00204|binding|INFO|Removing iface tapc926feac-0f ovn-installed in OVS
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.705 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.720 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:4e:2a 10.100.0.9'], port_security=['fa:16:3e:91:4e:2a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c9a442a2-b67f-45a9-a7b3-2f866d137327', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=c926feac-0f5a-4138-a74f-f066c3bf5f80) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.722 104379 INFO neutron.agent.ovn.metadata.agent [-] Port c926feac-0f5a-4138-a74f-f066c3bf5f80 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.723 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.724 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.739 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a94a0b6d-4856-43cc-9350-9174c1c542c2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec 03 00:19:21 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001a.scope: Consumed 14.762s CPU time.
Dec 03 00:19:21 compute-0 systemd-machined[153518]: Machine qemu-17-instance-0000001a terminated.
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.767 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[5f78f756-cc6b-4fe7-8bba-a622d421dd5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.770 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[e752d116-38e3-42d6-9cfb-e2bcf35b2025]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.791 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[000510cb-e9a1-4ad1-bc5d-ec184ab40d0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.809 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[25154721-37b2-4ab1-964c-f53c7b2c9f11]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517500, 'reachable_time': 28846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220217, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.822 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f73d9ad4-5427-4fb1-b125-754f04bd403f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517511, 'tstamp': 517511}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220218, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517516, 'tstamp': 517516}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220218, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.824 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.825 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.831 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.831 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.831 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.831 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.832 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.833 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ebea9b9a-6f83-4ae0-8be6-251d95f46966]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.857 187247 DEBUG nova.compute.manager [req-73a574ea-88de-4a83-b016-25808b55a35e req-d33bd1c4-05d8-4687-b190-8740f95ac037 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.857 187247 DEBUG oslo_concurrency.lockutils [req-73a574ea-88de-4a83-b016-25808b55a35e req-d33bd1c4-05d8-4687-b190-8740f95ac037 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.857 187247 DEBUG oslo_concurrency.lockutils [req-73a574ea-88de-4a83-b016-25808b55a35e req-d33bd1c4-05d8-4687-b190-8740f95ac037 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.858 187247 DEBUG oslo_concurrency.lockutils [req-73a574ea-88de-4a83-b016-25808b55a35e req-d33bd1c4-05d8-4687-b190-8740f95ac037 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.858 187247 DEBUG nova.compute.manager [req-73a574ea-88de-4a83-b016-25808b55a35e req-d33bd1c4-05d8-4687-b190-8740f95ac037 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.858 187247 DEBUG nova.compute.manager [req-73a574ea-88de-4a83-b016-25808b55a35e req-d33bd1c4-05d8-4687-b190-8740f95ac037 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.889 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.895 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.935 187247 DEBUG nova.virt.libvirt.guest [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.936 187247 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migration operation has completed
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.936 187247 INFO nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] _post_live_migration() is started..
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.938 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.938 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.938 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.950 187247 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.950 187247 WARNING neutronclient.v2_0.client [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.986 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:19:21 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:21.987 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:19:21 compute-0 nova_compute[187243]: 2025-12-03 00:19:21.987 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.307 187247 DEBUG nova.network.neutron [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port c926feac-0f5a-4138-a74f-f066c3bf5f80 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.307 187247 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.308 187247 DEBUG nova.virt.libvirt.vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:17:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1409069009',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1409069',id=26,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-82faa1lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:18:58Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=c9a442a2-b67f-45a9-a7b3-2f866d137327,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.308 187247 DEBUG nova.network.os_vif_util [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "address": "fa:16:3e:91:4e:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc926feac-0f", "ovs_interfaceid": "c926feac-0f5a-4138-a74f-f066c3bf5f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.309 187247 DEBUG nova.network.os_vif_util [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.309 187247 DEBUG os_vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.311 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.311 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc926feac-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.312 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.314 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.315 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.315 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1805a310-30d9-4e18-9fd8-c6d18514748d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.316 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.317 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.319 187247 INFO os_vif [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:4e:2a,bridge_name='br-int',has_traffic_filtering=True,id=c926feac-0f5a-4138-a74f-f066c3bf5f80,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc926feac-0f')
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.319 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.319 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.319 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.320 187247 DEBUG nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.320 187247 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Deleting instance files /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327_del
Dec 03 00:19:22 compute-0 nova_compute[187243]: 2025-12-03 00:19:22.320 187247 INFO nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Deletion of /var/lib/nova/instances/c9a442a2-b67f-45a9-a7b3-2f866d137327_del complete
Dec 03 00:19:23 compute-0 podman[220237]: 2025-12-03 00:19:23.101378735 +0000 UTC m=+0.060316248 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:19:23 compute-0 podman[220238]: 2025-12-03 00:19:23.12942572 +0000 UTC m=+0.083959841 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.907 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.907 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.907 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.908 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.908 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.908 187247 WARNING nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received unexpected event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with vm_state active and task_state migrating.
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.908 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.908 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.909 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.909 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.909 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.909 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.909 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.910 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.910 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.910 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.910 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.910 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-unplugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.911 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.911 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.911 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.911 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.911 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.912 187247 WARNING nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received unexpected event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with vm_state active and task_state migrating.
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.912 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.912 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.912 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.912 187247 DEBUG oslo_concurrency.lockutils [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.912 187247 DEBUG nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] No waiting events found dispatching network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:23 compute-0 nova_compute[187243]: 2025-12-03 00:19:23.913 187247 WARNING nova.compute.manager [req-9a490f9b-1ff3-4f92-ac85-d1069b8247f7 req-49c6be33-bedb-43ef-b2f7-eeb5e2c1d7d5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Received unexpected event network-vif-plugged-c926feac-0f5a-4138-a74f-f066c3bf5f80 for instance with vm_state active and task_state migrating.
Dec 03 00:19:24 compute-0 nova_compute[187243]: 2025-12-03 00:19:24.020 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:27 compute-0 nova_compute[187243]: 2025-12-03 00:19:27.316 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:27.988 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:29 compute-0 nova_compute[187243]: 2025-12-03 00:19:29.022 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:29 compute-0 podman[197600]: time="2025-12-03T00:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:19:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:19:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3071 "" "Go-http-client/1.1"
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: ERROR   00:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: ERROR   00:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: ERROR   00:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: ERROR   00:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: ERROR   00:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:19:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:19:31 compute-0 nova_compute[187243]: 2025-12-03 00:19:31.852 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:31 compute-0 nova_compute[187243]: 2025-12-03 00:19:31.853 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:31 compute-0 nova_compute[187243]: 2025-12-03 00:19:31.853 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "c9a442a2-b67f-45a9-a7b3-2f866d137327-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:32 compute-0 sshd-session[220285]: Invalid user janice from 49.247.36.49 port 55369
Dec 03 00:19:32 compute-0 sshd-session[220285]: Received disconnect from 49.247.36.49 port 55369:11: Bye Bye [preauth]
Dec 03 00:19:32 compute-0 sshd-session[220285]: Disconnected from invalid user janice 49.247.36.49 port 55369 [preauth]
Dec 03 00:19:32 compute-0 nova_compute[187243]: 2025-12-03 00:19:32.319 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:32 compute-0 nova_compute[187243]: 2025-12-03 00:19:32.364 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:32 compute-0 nova_compute[187243]: 2025-12-03 00:19:32.365 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:32 compute-0 nova_compute[187243]: 2025-12-03 00:19:32.365 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:32 compute-0 nova_compute[187243]: 2025-12-03 00:19:32.365 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.402 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.454 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.455 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.508 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.631 187247 WARNING nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.632 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.648 187247 DEBUG oslo_concurrency.processutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.648 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5667MB free_disk=73.1335678100586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.649 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:33 compute-0 nova_compute[187243]: 2025-12-03 00:19:33.649 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:34 compute-0 nova_compute[187243]: 2025-12-03 00:19:34.074 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:34 compute-0 nova_compute[187243]: 2025-12-03 00:19:34.768 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance c9a442a2-b67f-45a9-a7b3-2f866d137327 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:19:35 compute-0 podman[220295]: 2025-12-03 00:19:35.101425344 +0000 UTC m=+0.056889731 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.279 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.279 187247 INFO nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating resource usage from migration d693345a-fa5e-4845-a60d-9331bd660235
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.313 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration d693345a-fa5e-4845-a60d-9331bd660235 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.313 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 7fa12d64-4350-4688-9062-94b18965be36 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.313 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.314 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:19:33 up  1:27,  0 user,  load average: 0.41, 0.33, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.368 187247 DEBUG nova.compute.provider_tree [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:19:35 compute-0 nova_compute[187243]: 2025-12-03 00:19:35.885 187247 DEBUG nova.scheduler.client.report [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:19:36 compute-0 nova_compute[187243]: 2025-12-03 00:19:36.393 187247 DEBUG nova.compute.resource_tracker [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:19:36 compute-0 nova_compute[187243]: 2025-12-03 00:19:36.394 187247 DEBUG oslo_concurrency.lockutils [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.745s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:36 compute-0 nova_compute[187243]: 2025-12-03 00:19:36.411 187247 INFO nova.compute.manager [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:19:37 compute-0 nova_compute[187243]: 2025-12-03 00:19:37.321 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:37 compute-0 nova_compute[187243]: 2025-12-03 00:19:37.509 187247 INFO nova.scheduler.client.report [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 7fa12d64-4350-4688-9062-94b18965be36
Dec 03 00:19:37 compute-0 nova_compute[187243]: 2025-12-03 00:19:37.510 187247 DEBUG nova.virt.libvirt.driver [None req-2600a28e-d15b-4ffa-a80c-70a2f0b9c7d5 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: c9a442a2-b67f-45a9-a7b3-2f866d137327] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.528 187247 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.587 187247 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.588 187247 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.640 187247 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.641 187247 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Preparing to wait for external event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.642 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.642 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:38 compute-0 nova_compute[187243]: 2025-12-03 00:19:38.642 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:39 compute-0 nova_compute[187243]: 2025-12-03 00:19:39.074 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:40 compute-0 podman[220326]: 2025-12-03 00:19:40.122579409 +0000 UTC m=+0.079401085 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 03 00:19:40 compute-0 podman[220327]: 2025-12-03 00:19:40.128413708 +0000 UTC m=+0.083221473 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:19:42 compute-0 nova_compute[187243]: 2025-12-03 00:19:42.322 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:44 compute-0 nova_compute[187243]: 2025-12-03 00:19:44.075 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:45 compute-0 nova_compute[187243]: 2025-12-03 00:19:45.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:46 compute-0 nova_compute[187243]: 2025-12-03 00:19:46.212 187247 DEBUG nova.compute.manager [req-da235189-31cf-4691-873f-3fa5936041a8 req-8b5bdc3b-a3ba-426c-b4c0-f28942dd5d09 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:46 compute-0 nova_compute[187243]: 2025-12-03 00:19:46.213 187247 DEBUG oslo_concurrency.lockutils [req-da235189-31cf-4691-873f-3fa5936041a8 req-8b5bdc3b-a3ba-426c-b4c0-f28942dd5d09 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:46 compute-0 nova_compute[187243]: 2025-12-03 00:19:46.213 187247 DEBUG oslo_concurrency.lockutils [req-da235189-31cf-4691-873f-3fa5936041a8 req-8b5bdc3b-a3ba-426c-b4c0-f28942dd5d09 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:46 compute-0 nova_compute[187243]: 2025-12-03 00:19:46.213 187247 DEBUG oslo_concurrency.lockutils [req-da235189-31cf-4691-873f-3fa5936041a8 req-8b5bdc3b-a3ba-426c-b4c0-f28942dd5d09 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:46 compute-0 nova_compute[187243]: 2025-12-03 00:19:46.214 187247 DEBUG nova.compute.manager [req-da235189-31cf-4691-873f-3fa5936041a8 req-8b5bdc3b-a3ba-426c-b4c0-f28942dd5d09 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No event matching network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e in dict_keys([('network-vif-plugged', '8b0adcad-4e57-4150-b6d7-890ceb893e2e')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:19:46 compute-0 nova_compute[187243]: 2025-12-03 00:19:46.214 187247 DEBUG nova.compute.manager [req-da235189-31cf-4691-873f-3fa5936041a8 req-8b5bdc3b-a3ba-426c-b4c0-f28942dd5d09 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:47 compute-0 nova_compute[187243]: 2025-12-03 00:19:47.165 187247 INFO nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Took 8.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:19:47 compute-0 nova_compute[187243]: 2025-12-03 00:19:47.324 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.271 187247 DEBUG nova.compute.manager [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.272 187247 DEBUG oslo_concurrency.lockutils [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.272 187247 DEBUG oslo_concurrency.lockutils [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.272 187247 DEBUG oslo_concurrency.lockutils [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.273 187247 DEBUG nova.compute.manager [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Processing event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.273 187247 DEBUG nova.compute.manager [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-changed-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.273 187247 DEBUG nova.compute.manager [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Refreshing instance network info cache due to event network-changed-8b0adcad-4e57-4150-b6d7-890ceb893e2e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.273 187247 DEBUG oslo_concurrency.lockutils [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.273 187247 DEBUG oslo_concurrency.lockutils [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.274 187247 DEBUG nova.network.neutron [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Refreshing network info cache for port 8b0adcad-4e57-4150-b6d7-890ceb893e2e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.275 187247 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.780 187247 WARNING neutronclient.v2_0.client [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:48 compute-0 nova_compute[187243]: 2025-12-03 00:19:48.785 187247 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpazvfqycv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e3ecd0e-4de1-44c9-805b-8d695da6b95e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d693345a-fa5e-4845-a60d-9331bd660235),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:19:49 compute-0 nova_compute[187243]: 2025-12-03 00:19:49.076 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:49 compute-0 nova_compute[187243]: 2025-12-03 00:19:49.284 187247 WARNING neutronclient.v2_0.client [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:49 compute-0 nova_compute[187243]: 2025-12-03 00:19:49.299 187247 DEBUG nova.objects.instance [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 2e3ecd0e-4de1-44c9-805b-8d695da6b95e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:19:49 compute-0 nova_compute[187243]: 2025-12-03 00:19:49.300 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:19:49 compute-0 nova_compute[187243]: 2025-12-03 00:19:49.301 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:19:49 compute-0 nova_compute[187243]: 2025-12-03 00:19:49.302 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.102 187247 DEBUG nova.network.neutron [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updated VIF entry in instance network info cache for port 8b0adcad-4e57-4150-b6d7-890ceb893e2e. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.102 187247 DEBUG nova.network.neutron [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating instance_info_cache with network_info: [{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.104 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.104 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.105 187247 DEBUG nova.virt.libvirt.vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:18:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1045897',id=27,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-78r0m2qy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:18:33Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.106 187247 DEBUG nova.network.os_vif_util [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.106 187247 DEBUG nova.network.os_vif_util [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.107 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:9d:ff:2c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <target dev="tap8b0adcad-4e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]: </interface>
Dec 03 00:19:50 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.107 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <name>instance-0000001b</name>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <uuid>2e3ecd0e-4de1-44c9-805b-8d695da6b95e</uuid>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744</nova:name>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:27</nova:creationTime>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:port uuid="8b0adcad-4e57-4150-b6d7-890ceb893e2e">
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <system>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="serial">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="uuid">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </system>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <os>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </os>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <features>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </features>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9d:ff:2c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="tap8b0adcad-4e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </target>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </console>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </input>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <video>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </video>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]: </domain>
Dec 03 00:19:50 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.109 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <name>instance-0000001b</name>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <uuid>2e3ecd0e-4de1-44c9-805b-8d695da6b95e</uuid>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744</nova:name>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:27</nova:creationTime>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:port uuid="8b0adcad-4e57-4150-b6d7-890ceb893e2e">
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <system>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="serial">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="uuid">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </system>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <os>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </os>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <features>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </features>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9d:ff:2c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="tap8b0adcad-4e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </target>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </console>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </input>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <video>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </video>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]: </domain>
Dec 03 00:19:50 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.109 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <name>instance-0000001b</name>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <uuid>2e3ecd0e-4de1-44c9-805b-8d695da6b95e</uuid>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744</nova:name>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:18:27</nova:creationTime>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <nova:port uuid="8b0adcad-4e57-4150-b6d7-890ceb893e2e">
Dec 03 00:19:50 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <system>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="serial">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="uuid">2e3ecd0e-4de1-44c9-805b-8d695da6b95e</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </system>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <os>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </os>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <features>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </features>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/disk.config"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9d:ff:2c"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target dev="tap8b0adcad-4e"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:19:50 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       </target>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e/console.log" append="off"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </console>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </input>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <video>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </video>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:19:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:19:50 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:19:50 compute-0 nova_compute[187243]: </domain>
Dec 03 00:19:50 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.110 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.606 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.607 187247 INFO nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:19:50 compute-0 nova_compute[187243]: 2025-12-03 00:19:50.612 187247 DEBUG oslo_concurrency.lockutils [req-740dc21c-140d-460c-a72d-1b2dcd95a3ca req-d4ef7a59-374e-43c8-b8aa-342dd2df536b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-2e3ecd0e-4de1-44c9-805b-8d695da6b95e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:19:51 compute-0 nova_compute[187243]: 2025-12-03 00:19:51.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:51 compute-0 nova_compute[187243]: 2025-12-03 00:19:51.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:19:51 compute-0 nova_compute[187243]: 2025-12-03 00:19:51.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:51 compute-0 nova_compute[187243]: 2025-12-03 00:19:51.625 187247 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.128 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.129 187247 DEBUG nova.virt.libvirt.migration [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:19:52 compute-0 kernel: tap8b0adcad-4e (unregistering): left promiscuous mode
Dec 03 00:19:52 compute-0 NetworkManager[55671]: <info>  [1764721192.1974] device (tap8b0adcad-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.199 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.199 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.199 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.200 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:19:52 compute-0 ovn_controller[95488]: 2025-12-03T00:19:52Z|00205|binding|INFO|Releasing lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e from this chassis (sb_readonly=0)
Dec 03 00:19:52 compute-0 ovn_controller[95488]: 2025-12-03T00:19:52Z|00206|binding|INFO|Setting lport 8b0adcad-4e57-4150-b6d7-890ceb893e2e down in Southbound
Dec 03 00:19:52 compute-0 ovn_controller[95488]: 2025-12-03T00:19:52Z|00207|binding|INFO|Removing iface tap8b0adcad-4e ovn-installed in OVS
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.207 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.213 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:ff:2c 10.100.0.10'], port_security=['fa:16:3e:9d:ff:2c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2e3ecd0e-4de1-44c9-805b-8d695da6b95e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=8b0adcad-4e57-4150-b6d7-890ceb893e2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.214 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 8b0adcad-4e57-4150-b6d7-890ceb893e2e in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.215 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.216 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb27e6b-6582-4f68-a058-92f753c5a2a0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.216 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace which is not needed anymore
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.221 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:52 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec 03 00:19:52 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001b.scope: Consumed 15.723s CPU time.
Dec 03 00:19:52 compute-0 systemd-machined[153518]: Machine qemu-18-instance-0000001b terminated.
Dec 03 00:19:52 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [NOTICE]   (219796) : haproxy version is 3.0.5-8e879a5
Dec 03 00:19:52 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [NOTICE]   (219796) : path to executable is /usr/sbin/haproxy
Dec 03 00:19:52 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [WARNING]  (219796) : Exiting Master process...
Dec 03 00:19:52 compute-0 podman[220404]: 2025-12-03 00:19:52.31660767 +0000 UTC m=+0.027554813 container kill 9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:19:52 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [ALERT]    (219796) : Current worker (219798) exited with code 143 (Terminated)
Dec 03 00:19:52 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[219792]: [WARNING]  (219796) : All workers exited. Exiting... (0)
Dec 03 00:19:52 compute-0 systemd[1]: libpod-9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2.scope: Deactivated successfully.
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.371 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.381 187247 DEBUG nova.compute.manager [req-bbdc24e0-0af7-4492-8026-3d23e97c61f9 req-e2723f15-235c-4964-9401-8ad1d94816db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.382 187247 DEBUG oslo_concurrency.lockutils [req-bbdc24e0-0af7-4492-8026-3d23e97c61f9 req-e2723f15-235c-4964-9401-8ad1d94816db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.382 187247 DEBUG oslo_concurrency.lockutils [req-bbdc24e0-0af7-4492-8026-3d23e97c61f9 req-e2723f15-235c-4964-9401-8ad1d94816db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.382 187247 DEBUG oslo_concurrency.lockutils [req-bbdc24e0-0af7-4492-8026-3d23e97c61f9 req-e2723f15-235c-4964-9401-8ad1d94816db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.382 187247 DEBUG nova.compute.manager [req-bbdc24e0-0af7-4492-8026-3d23e97c61f9 req-e2723f15-235c-4964-9401-8ad1d94816db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.383 187247 DEBUG nova.compute.manager [req-bbdc24e0-0af7-4492-8026-3d23e97c61f9 req-e2723f15-235c-4964-9401-8ad1d94816db 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:52 compute-0 podman[220417]: 2025-12-03 00:19:52.407426885 +0000 UTC m=+0.023774667 container died 9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.428 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.429 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.429 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:19:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2-userdata-shm.mount: Deactivated successfully.
Dec 03 00:19:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-cef287c549175ae16bcd4efac9406ba74739717802e4a4fd7bbbf1ee200c6b92-merged.mount: Deactivated successfully.
Dec 03 00:19:52 compute-0 podman[220417]: 2025-12-03 00:19:52.454050664 +0000 UTC m=+0.070398436 container remove 9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:19:52 compute-0 systemd[1]: libpod-conmon-9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2.scope: Deactivated successfully.
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.461 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7328146c-eb57-41dd-a2dd-a2cdfde966a0]: (4, ("Wed Dec  3 12:19:52 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2)\n9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2\nWed Dec  3 12:19:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2)\n9fe4885336c26b5505b23cc7dc1320a25bd7a7792b24edc50467572ed77b15c2\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.463 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[509264b7-1517-4f34-9d19-1f84a9fe45d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.463 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.464 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5f1087-7d8b-4b69-972a-82722e436a6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.465 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.466 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:52 compute-0 kernel: tapf7ff943d-e0: left promiscuous mode
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.482 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.485 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ce04b922-e045-421c-a09b-4715bd32b6eb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.502 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4a00f940-8ba6-4cd6-8290-28c254c3644e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.503 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bbf667-caa4-4d74-9ad7-9f08629e8235]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.517 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[dc207c04-0fbe-4da3-982d-b1fbca418911]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517492, 'reachable_time': 19832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220464, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.520 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:19:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:19:52.520 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[7afbd75b-7471-42d7-bd79-61b3b32c3b48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:19:52 compute-0 systemd[1]: run-netns-ovnmeta\x2df7ff943d\x2de57d\x2d4bc2\x2d8dd6\x2df8a8bb6e4f89.mount: Deactivated successfully.
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.583 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.584 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.603 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.603 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5806MB free_disk=73.1335678100586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.604 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.604 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.631 187247 DEBUG nova.virt.libvirt.guest [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '2e3ecd0e-4de1-44c9-805b-8d695da6b95e' (instance-0000001b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.631 187247 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migration operation has completed
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.632 187247 INFO nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] _post_live_migration() is started..
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.644 187247 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:52 compute-0 nova_compute[187243]: 2025-12-03 00:19:52.645 187247 WARNING neutronclient.v2_0.client [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.262 187247 DEBUG nova.network.neutron [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 8b0adcad-4e57-4150-b6d7-890ceb893e2e and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.263 187247 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.263 187247 DEBUG nova.virt.libvirt.vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:18:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-104589744',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1045897',id=27,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:18:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-78r0m2qy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:18:58Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=2e3ecd0e-4de1-44c9-805b-8d695da6b95e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.264 187247 DEBUG nova.network.os_vif_util [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "address": "fa:16:3e:9d:ff:2c", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b0adcad-4e", "ovs_interfaceid": "8b0adcad-4e57-4150-b6d7-890ceb893e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.264 187247 DEBUG nova.network.os_vif_util [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.265 187247 DEBUG os_vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.266 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.267 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b0adcad-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.268 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.270 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.271 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.271 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=87f5ac60-fb7e-4bce-883d-d28d23c59851) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.271 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.272 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.274 187247 INFO os_vif [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:ff:2c,bridge_name='br-int',has_traffic_filtering=True,id=8b0adcad-4e57-4150-b6d7-890ceb893e2e,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b0adcad-4e')
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.274 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.621 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Updating resource usage from migration d693345a-fa5e-4845-a60d-9331bd660235
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.704 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration d693345a-fa5e-4845-a60d-9331bd660235 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.704 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.705 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:19:52 up  1:28,  0 user,  load average: 0.29, 0.31, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:19:53 compute-0 nova_compute[187243]: 2025-12-03 00:19:53.735 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.079 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:54 compute-0 podman[220466]: 2025-12-03 00:19:54.103610144 +0000 UTC m=+0.054408408 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:19:54 compute-0 podman[220467]: 2025-12-03 00:19:54.103526112 +0000 UTC m=+0.052584341 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.242 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.449 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.450 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.450 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.451 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.451 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.451 187247 WARNING nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received unexpected event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with vm_state active and task_state migrating.
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.452 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.452 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.452 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.453 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.453 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.454 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.454 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.454 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.455 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.455 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.455 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.456 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-unplugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.456 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.456 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.457 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.457 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.457 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.458 187247 WARNING nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received unexpected event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with vm_state active and task_state migrating.
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.458 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.458 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.459 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.459 187247 DEBUG oslo_concurrency.lockutils [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.459 187247 DEBUG nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] No waiting events found dispatching network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.460 187247 WARNING nova.compute.manager [req-9b2d8816-6db6-4b91-ae49-c68e015db6f6 req-cd657e85-7ab5-479e-b921-4291d2a55595 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Received unexpected event network-vif-plugged-8b0adcad-4e57-4150-b6d7-890ceb893e2e for instance with vm_state active and task_state migrating.
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.754 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.755 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.755 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 1.481s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.755 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.756 187247 DEBUG nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.756 187247 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Deleting instance files /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e_del
Dec 03 00:19:54 compute-0 nova_compute[187243]: 2025-12-03 00:19:54.757 187247 INFO nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Deletion of /var/lib/nova/instances/2e3ecd0e-4de1-44c9-805b-8d695da6b95e_del complete
Dec 03 00:19:57 compute-0 nova_compute[187243]: 2025-12-03 00:19:57.761 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:57 compute-0 nova_compute[187243]: 2025-12-03 00:19:57.761 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:58 compute-0 nova_compute[187243]: 2025-12-03 00:19:58.272 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:59 compute-0 nova_compute[187243]: 2025-12-03 00:19:59.080 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:19:59 compute-0 nova_compute[187243]: 2025-12-03 00:19:59.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:19:59 compute-0 podman[197600]: time="2025-12-03T00:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:19:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:19:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec 03 00:20:00 compute-0 sshd-session[220508]: Invalid user syncuser from 61.220.235.10 port 54814
Dec 03 00:20:00 compute-0 sshd-session[220508]: Received disconnect from 61.220.235.10 port 54814:11: Bye Bye [preauth]
Dec 03 00:20:00 compute-0 sshd-session[220508]: Disconnected from invalid user syncuser 61.220.235.10 port 54814 [preauth]
Dec 03 00:20:00 compute-0 nova_compute[187243]: 2025-12-03 00:20:00.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:00.719 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:00.719 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:00.720 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: ERROR   00:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: ERROR   00:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: ERROR   00:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: ERROR   00:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: ERROR   00:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:20:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:20:02 compute-0 sshd-session[220510]: Received disconnect from 102.210.148.92 port 44134:11: Bye Bye [preauth]
Dec 03 00:20:02 compute-0 sshd-session[220510]: Disconnected from authenticating user root 102.210.148.92 port 44134 [preauth]
Dec 03 00:20:03 compute-0 nova_compute[187243]: 2025-12-03 00:20:03.273 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:04 compute-0 nova_compute[187243]: 2025-12-03 00:20:04.094 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.293 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.293 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.294 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "2e3ecd0e-4de1-44c9-805b-8d695da6b95e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.810 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.811 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.811 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.811 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.994 187247 WARNING nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:20:05 compute-0 nova_compute[187243]: 2025-12-03 00:20:05.995 187247 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:06 compute-0 nova_compute[187243]: 2025-12-03 00:20:06.014 187247 DEBUG oslo_concurrency.processutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:06 compute-0 nova_compute[187243]: 2025-12-03 00:20:06.014 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5820MB free_disk=73.16234588623047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:20:06 compute-0 nova_compute[187243]: 2025-12-03 00:20:06.014 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:06 compute-0 nova_compute[187243]: 2025-12-03 00:20:06.015 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:06 compute-0 podman[220514]: 2025-12-03 00:20:06.098021971 +0000 UTC m=+0.059530629 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:20:07 compute-0 nova_compute[187243]: 2025-12-03 00:20:07.034 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 2e3ecd0e-4de1-44c9-805b-8d695da6b95e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:20:07 compute-0 nova_compute[187243]: 2025-12-03 00:20:07.543 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:20:07 compute-0 nova_compute[187243]: 2025-12-03 00:20:07.768 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration d693345a-fa5e-4845-a60d-9331bd660235 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:20:07 compute-0 nova_compute[187243]: 2025-12-03 00:20:07.769 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:20:07 compute-0 nova_compute[187243]: 2025-12-03 00:20:07.769 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:20:06 up  1:28,  0 user,  load average: 0.23, 0.29, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:20:07 compute-0 nova_compute[187243]: 2025-12-03 00:20:07.813 187247 DEBUG nova.compute.provider_tree [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:20:08 compute-0 nova_compute[187243]: 2025-12-03 00:20:08.276 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:08 compute-0 nova_compute[187243]: 2025-12-03 00:20:08.328 187247 DEBUG nova.scheduler.client.report [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:20:08 compute-0 nova_compute[187243]: 2025-12-03 00:20:08.841 187247 DEBUG nova.compute.resource_tracker [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:20:08 compute-0 nova_compute[187243]: 2025-12-03 00:20:08.842 187247 DEBUG oslo_concurrency.lockutils [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.827s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:08 compute-0 nova_compute[187243]: 2025-12-03 00:20:08.864 187247 INFO nova.compute.manager [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:20:09 compute-0 nova_compute[187243]: 2025-12-03 00:20:09.096 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:09 compute-0 nova_compute[187243]: 2025-12-03 00:20:09.959 187247 INFO nova.scheduler.client.report [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration d693345a-fa5e-4845-a60d-9331bd660235
Dec 03 00:20:09 compute-0 nova_compute[187243]: 2025-12-03 00:20:09.960 187247 DEBUG nova.virt.libvirt.driver [None req-b469aa97-7aa5-4fb7-ae42-2a9353392f05 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 2e3ecd0e-4de1-44c9-805b-8d695da6b95e] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:20:11 compute-0 podman[220537]: 2025-12-03 00:20:11.106991746 +0000 UTC m=+0.064037654 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 03 00:20:11 compute-0 podman[220538]: 2025-12-03 00:20:11.177818761 +0000 UTC m=+0.123587261 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:20:13 compute-0 nova_compute[187243]: 2025-12-03 00:20:13.277 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:14 compute-0 nova_compute[187243]: 2025-12-03 00:20:14.098 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:18 compute-0 nova_compute[187243]: 2025-12-03 00:20:18.279 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:19 compute-0 nova_compute[187243]: 2025-12-03 00:20:19.100 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:23 compute-0 nova_compute[187243]: 2025-12-03 00:20:23.280 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:24 compute-0 nova_compute[187243]: 2025-12-03 00:20:24.102 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:25 compute-0 podman[220585]: 2025-12-03 00:20:25.095662606 +0000 UTC m=+0.049862812 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 00:20:25 compute-0 podman[220586]: 2025-12-03 00:20:25.100948061 +0000 UTC m=+0.051827852 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, 
io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc.)
Dec 03 00:20:25 compute-0 sshd-session[220583]: Invalid user valheim from 45.78.219.95 port 55194
Dec 03 00:20:25 compute-0 sshd-session[220583]: Received disconnect from 45.78.219.95 port 55194:11: Bye Bye [preauth]
Dec 03 00:20:25 compute-0 sshd-session[220583]: Disconnected from invalid user valheim 45.78.219.95 port 55194 [preauth]
Dec 03 00:20:28 compute-0 nova_compute[187243]: 2025-12-03 00:20:28.282 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:28 compute-0 sshd-session[220626]: Received disconnect from 101.47.140.127 port 49296:11: Bye Bye [preauth]
Dec 03 00:20:28 compute-0 sshd-session[220626]: Disconnected from authenticating user root 101.47.140.127 port 49296 [preauth]
Dec 03 00:20:28 compute-0 nova_compute[187243]: 2025-12-03 00:20:28.650 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:28 compute-0 nova_compute[187243]: 2025-12-03 00:20:28.650 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:29 compute-0 nova_compute[187243]: 2025-12-03 00:20:29.104 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:29 compute-0 nova_compute[187243]: 2025-12-03 00:20:29.157 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:20:29 compute-0 nova_compute[187243]: 2025-12-03 00:20:29.715 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:29 compute-0 nova_compute[187243]: 2025-12-03 00:20:29.715 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:29 compute-0 nova_compute[187243]: 2025-12-03 00:20:29.721 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:20:29 compute-0 nova_compute[187243]: 2025-12-03 00:20:29.721 187247 INFO nova.compute.claims [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:20:29 compute-0 podman[197600]: time="2025-12-03T00:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:20:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:20:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec 03 00:20:30 compute-0 nova_compute[187243]: 2025-12-03 00:20:30.772 187247 DEBUG nova.compute.provider_tree [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:20:31 compute-0 nova_compute[187243]: 2025-12-03 00:20:31.280 187247 DEBUG nova.scheduler.client.report [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: ERROR   00:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: ERROR   00:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: ERROR   00:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: ERROR   00:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: ERROR   00:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:20:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:20:31 compute-0 nova_compute[187243]: 2025-12-03 00:20:31.853 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:31 compute-0 nova_compute[187243]: 2025-12-03 00:20:31.854 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:20:32 compute-0 nova_compute[187243]: 2025-12-03 00:20:32.364 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:20:32 compute-0 nova_compute[187243]: 2025-12-03 00:20:32.364 187247 DEBUG nova.network.neutron [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:20:32 compute-0 nova_compute[187243]: 2025-12-03 00:20:32.364 187247 WARNING neutronclient.v2_0.client [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:32 compute-0 nova_compute[187243]: 2025-12-03 00:20:32.365 187247 WARNING neutronclient.v2_0.client [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:32 compute-0 nova_compute[187243]: 2025-12-03 00:20:32.870 187247 INFO nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:20:33 compute-0 nova_compute[187243]: 2025-12-03 00:20:33.017 187247 DEBUG nova.network.neutron [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Successfully created port: 96aba6d6-d4d8-494d-9070-4ad5c1609fdf _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:20:33 compute-0 nova_compute[187243]: 2025-12-03 00:20:33.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:33 compute-0 nova_compute[187243]: 2025-12-03 00:20:33.379 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.107 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.292 187247 DEBUG nova.network.neutron [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Successfully updated port: 96aba6d6-d4d8-494d-9070-4ad5c1609fdf _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.340 187247 DEBUG nova.compute.manager [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-changed-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.340 187247 DEBUG nova.compute.manager [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Refreshing instance network info cache due to event network-changed-96aba6d6-d4d8-494d-9070-4ad5c1609fdf. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.341 187247 DEBUG oslo_concurrency.lockutils [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.341 187247 DEBUG oslo_concurrency.lockutils [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.341 187247 DEBUG nova.network.neutron [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Refreshing network info cache for port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.399 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.400 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.401 187247 INFO nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Creating image(s)
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.401 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.402 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.402 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.404 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.408 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.409 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.487 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.488 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.489 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.489 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.492 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.492 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.564 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.565 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.599 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.600 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.600 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.652 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.653 187247 DEBUG nova.virt.disk.api [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Checking if we can resize image /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.653 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.705 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.707 187247 DEBUG nova.virt.disk.api [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Cannot resize image /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.707 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.708 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Ensure instance console log exists: /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.708 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.709 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.709 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.798 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:20:34 compute-0 nova_compute[187243]: 2025-12-03 00:20:34.846 187247 WARNING neutronclient.v2_0.client [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:35 compute-0 nova_compute[187243]: 2025-12-03 00:20:35.945 187247 DEBUG nova.network.neutron [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:20:37 compute-0 nova_compute[187243]: 2025-12-03 00:20:37.075 187247 DEBUG nova.network.neutron [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:37 compute-0 podman[220643]: 2025-12-03 00:20:37.085725614 +0000 UTC m=+0.046966872 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:20:37 compute-0 nova_compute[187243]: 2025-12-03 00:20:37.581 187247 DEBUG oslo_concurrency.lockutils [req-42d74039-477d-4711-9693-9ac4c85708d7 req-7d0c2641-704b-49bf-bb65-21f5b88d958f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:20:37 compute-0 nova_compute[187243]: 2025-12-03 00:20:37.582 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquired lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:20:37 compute-0 nova_compute[187243]: 2025-12-03 00:20:37.582 187247 DEBUG nova.network.neutron [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:20:38 compute-0 nova_compute[187243]: 2025-12-03 00:20:38.284 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:38 compute-0 nova_compute[187243]: 2025-12-03 00:20:38.799 187247 DEBUG nova.network.neutron [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:20:39 compute-0 nova_compute[187243]: 2025-12-03 00:20:39.104 187247 WARNING neutronclient.v2_0.client [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:39 compute-0 nova_compute[187243]: 2025-12-03 00:20:39.108 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.073 187247 DEBUG nova.network.neutron [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating instance_info_cache with network_info: [{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.591 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Releasing lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.592 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance network_info: |[{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.594 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Start _get_guest_xml network_info=[{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.597 187247 WARNING nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.598 187247 DEBUG nova.virt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821', uuid='7643810a-7499-484f-80e2-2a0a33cafc55'), owner=OwnerMeta(userid='db24d5b25a924602ae8a7dc539bc6cbf', username='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin', projectid='e363b47741a1476ca7e5987b6d15acb5', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721240.598898) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.603 187247 DEBUG nova.virt.libvirt.host [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.604 187247 DEBUG nova.virt.libvirt.host [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.607 187247 DEBUG nova.virt.libvirt.host [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.607 187247 DEBUG nova.virt.libvirt.host [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.608 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.609 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.609 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.609 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.610 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.610 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.610 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.610 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.610 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.611 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.611 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.611 187247 DEBUG nova.virt.hardware [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.616 187247 DEBUG nova.virt.libvirt.vif [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137830',id=28,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-0dp6pegi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:20:33Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=7643810a-7499-484f-80e2-2a0a33cafc55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.616 187247 DEBUG nova.network.os_vif_util [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.617 187247 DEBUG nova.network.os_vif_util [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:20:40 compute-0 nova_compute[187243]: 2025-12-03 00:20:40.617 187247 DEBUG nova.objects.instance [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7643810a-7499-484f-80e2-2a0a33cafc55 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.126 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <uuid>7643810a-7499-484f-80e2-2a0a33cafc55</uuid>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <name>instance-0000001c</name>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821</nova:name>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:40</nova:creationTime>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:20:41 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:20:41 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         <nova:port uuid="96aba6d6-d4d8-494d-9070-4ad5c1609fdf">
Dec 03 00:20:41 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <system>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <entry name="serial">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <entry name="uuid">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </system>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <os>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </os>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <features>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </features>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9a:ff:2a"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <target dev="tap96aba6d6-d4"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <video>
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </video>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:20:41 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:20:41 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:20:41 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:20:41 compute-0 nova_compute[187243]: </domain>
Dec 03 00:20:41 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.127 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Preparing to wait for external event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.128 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.128 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.128 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.129 187247 DEBUG nova.virt.libvirt.vif [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137830',id=28,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-0dp6pegi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:20:33Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=7643810a-7499-484f-80e2-2a0a33cafc55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.129 187247 DEBUG nova.network.os_vif_util [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.129 187247 DEBUG nova.network.os_vif_util [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.130 187247 DEBUG os_vif [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.130 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.130 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.131 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.131 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.131 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c47d244b-3b23-5c14-b03c-8369e8c6b298', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.187 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.189 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.191 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.191 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96aba6d6-d4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.191 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap96aba6d6-d4, col_values=(('qos', UUID('592ab62f-d146-4b22-9251-f32755f23e64')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.191 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap96aba6d6-d4, col_values=(('external_ids', {'iface-id': '96aba6d6-d4d8-494d-9070-4ad5c1609fdf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:ff:2a', 'vm-uuid': '7643810a-7499-484f-80e2-2a0a33cafc55'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 NetworkManager[55671]: <info>  [1764721241.1936] manager: (tap96aba6d6-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.194 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.197 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:41 compute-0 nova_compute[187243]: 2025-12-03 00:20:41.198 187247 INFO os_vif [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4')
Dec 03 00:20:41 compute-0 sshd-session[220669]: Invalid user cc from 23.95.37.90 port 42168
Dec 03 00:20:41 compute-0 sshd-session[220669]: Received disconnect from 23.95.37.90 port 42168:11: Bye Bye [preauth]
Dec 03 00:20:41 compute-0 sshd-session[220669]: Disconnected from invalid user cc 23.95.37.90 port 42168 [preauth]
Dec 03 00:20:41 compute-0 podman[220671]: 2025-12-03 00:20:41.66346474 +0000 UTC m=+0.053688637 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:20:41 compute-0 podman[220672]: 2025-12-03 00:20:41.687709933 +0000 UTC m=+0.077318095 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 03 00:20:42 compute-0 nova_compute[187243]: 2025-12-03 00:20:42.748 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:20:42 compute-0 nova_compute[187243]: 2025-12-03 00:20:42.748 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:20:42 compute-0 nova_compute[187243]: 2025-12-03 00:20:42.748 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No VIF found with MAC fa:16:3e:9a:ff:2a, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:20:42 compute-0 nova_compute[187243]: 2025-12-03 00:20:42.749 187247 INFO nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Using config drive
Dec 03 00:20:43 compute-0 nova_compute[187243]: 2025-12-03 00:20:43.266 187247 WARNING neutronclient.v2_0.client [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.023 187247 INFO nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Creating config drive at /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.029 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjuw9zdkw execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.109 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.153 187247 DEBUG oslo_concurrency.processutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjuw9zdkw" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:44 compute-0 kernel: tap96aba6d6-d4: entered promiscuous mode
Dec 03 00:20:44 compute-0 NetworkManager[55671]: <info>  [1764721244.2064] manager: (tap96aba6d6-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Dec 03 00:20:44 compute-0 ovn_controller[95488]: 2025-12-03T00:20:44Z|00208|binding|INFO|Claiming lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf for this chassis.
Dec 03 00:20:44 compute-0 ovn_controller[95488]: 2025-12-03T00:20:44Z|00209|binding|INFO|96aba6d6-d4d8-494d-9070-4ad5c1609fdf: Claiming fa:16:3e:9a:ff:2a 10.100.0.8
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.209 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.220 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:ff:2a 10.100.0.8'], port_security=['fa:16:3e:9a:ff:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7643810a-7499-484f-80e2-2a0a33cafc55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=96aba6d6-d4d8-494d-9070-4ad5c1609fdf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.221 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 bound to our chassis
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.222 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:20:44 compute-0 ovn_controller[95488]: 2025-12-03T00:20:44Z|00210|binding|INFO|Setting lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf ovn-installed in OVS
Dec 03 00:20:44 compute-0 ovn_controller[95488]: 2025-12-03T00:20:44Z|00211|binding|INFO|Setting lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf up in Southbound
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.225 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-0 systemd-udevd[220734]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.233 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[11fbc08e-5db4-484f-aaaf-53d6f59231cf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.234 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7ff943d-e1 in ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.236 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7ff943d-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.236 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c80c593b-08bd-4c70-9f7f-7d17e718ccb2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.237 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e22eb029-a1f8-4c59-9e91-b046e6079ecb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 NetworkManager[55671]: <info>  [1764721244.2482] device (tap96aba6d6-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:20:44 compute-0 NetworkManager[55671]: <info>  [1764721244.2489] device (tap96aba6d6-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.251 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c11cfb-b4c4-40f5-9f43-fc3de316ac7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 systemd-machined[153518]: New machine qemu-19-instance-0000001c.
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.268 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4c8534-5885-48cd-9cee-f13aeb878974]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000001c.
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.299 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[2535f9f0-b7cc-4de4-8ad0-681f4ed8c9c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.303 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[53d81fce-82b5-45cf-9a15-b5fabdf87f9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 NetworkManager[55671]: <info>  [1764721244.3047] manager: (tapf7ff943d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Dec 03 00:20:44 compute-0 systemd-udevd[220739]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.337 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[0807bbff-08ea-45a5-8986-56a20a45af19]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.339 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0093f6-0809-45ab-9367-1ab53d5fff27]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 NetworkManager[55671]: <info>  [1764721244.3604] device (tapf7ff943d-e0): carrier: link connected
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.367 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[1d793be2-51b2-4a85-8485-6639f3d21f40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.376 187247 DEBUG nova.compute.manager [req-ab105013-8cba-40a0-83fe-a4b5ead819b5 req-672b4541-a07c-43ac-8bdc-60a53d15d4e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.377 187247 DEBUG oslo_concurrency.lockutils [req-ab105013-8cba-40a0-83fe-a4b5ead819b5 req-672b4541-a07c-43ac-8bdc-60a53d15d4e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.377 187247 DEBUG oslo_concurrency.lockutils [req-ab105013-8cba-40a0-83fe-a4b5ead819b5 req-672b4541-a07c-43ac-8bdc-60a53d15d4e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.377 187247 DEBUG oslo_concurrency.lockutils [req-ab105013-8cba-40a0-83fe-a4b5ead819b5 req-672b4541-a07c-43ac-8bdc-60a53d15d4e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.377 187247 DEBUG nova.compute.manager [req-ab105013-8cba-40a0-83fe-a4b5ead819b5 req-672b4541-a07c-43ac-8bdc-60a53d15d4e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Processing event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.384 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2bac355b-04eb-4ec4-bdd1-b43670c84496]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533500, 'reachable_time': 27608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220768, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.399 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0cee7f5e-6c93-4c10-a004-92730f24eac0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:9625'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533500, 'tstamp': 533500}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220769, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.419 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[306cfcc2-bddc-43f5-92aa-85bf6f495f38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533500, 'reachable_time': 27608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220770, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.452 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66bfdb30-f553-4880-8874-6871601d2e23]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.512 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cf254832-6c88-4d45-8ac7-2731e94eb1ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.513 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.514 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.514 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:44 compute-0 kernel: tapf7ff943d-e0: entered promiscuous mode
Dec 03 00:20:44 compute-0 NetworkManager[55671]: <info>  [1764721244.5163] manager: (tapf7ff943d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.515 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.518 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:44 compute-0 ovn_controller[95488]: 2025-12-03T00:20:44Z|00212|binding|INFO|Releasing lport 636cd919-869d-4a8a-92fa-ec7c18804da5 from this chassis (sb_readonly=0)
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.519 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.532 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.533 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6d44b6-0ca2-4671-a0c4-1a561b5c1354]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.534 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.534 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.534 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.534 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.534 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[34d8937a-95c9-470a-aa90-7893dc45ca22]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.535 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.535 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe753ab-ce26-4d4b-9b22-1160f1f24b5b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.536 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.536 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'env', 'PROCESS_TAG=haproxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:20:44 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:44.808 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:20:44 compute-0 nova_compute[187243]: 2025-12-03 00:20:44.896 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:45 compute-0 podman[220799]: 2025-12-03 00:20:44.937407074 +0000 UTC m=+0.020613396 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.340 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.344 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.346 187247 INFO nova.virt.libvirt.driver [-] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance spawned successfully.
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.347 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:20:45 compute-0 podman[220799]: 2025-12-03 00:20:45.548269179 +0000 UTC m=+0.631475481 container create 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Dec 03 00:20:45 compute-0 systemd[1]: Started libpod-conmon-1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4.scope.
Dec 03 00:20:45 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:20:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd186f06f51cbd737a3dcb973cb3a14faf33a33b0390bc2cc9298553b805d811/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:20:45 compute-0 podman[220799]: 2025-12-03 00:20:45.640388336 +0000 UTC m=+0.723594628 container init 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:20:45 compute-0 podman[220799]: 2025-12-03 00:20:45.648136836 +0000 UTC m=+0.731343128 container start 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Dec 03 00:20:45 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [NOTICE]   (220825) : New worker (220827) forked
Dec 03 00:20:45 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [NOTICE]   (220825) : Loading success.
Dec 03 00:20:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:45.728 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.864 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.864 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.865 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.865 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.866 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:20:45 compute-0 nova_compute[187243]: 2025-12-03 00:20:45.867 187247 DEBUG nova.virt.libvirt.driver [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.192 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.376 187247 INFO nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Took 11.98 seconds to spawn the instance on the hypervisor.
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.376 187247 DEBUG nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.445 187247 DEBUG nova.compute.manager [req-c548512b-ed06-42e7-b957-7e64561bda4e req-3524449c-c904-4fc0-9e6f-06b2d6d7d827 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.446 187247 DEBUG oslo_concurrency.lockutils [req-c548512b-ed06-42e7-b957-7e64561bda4e req-3524449c-c904-4fc0-9e6f-06b2d6d7d827 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.446 187247 DEBUG oslo_concurrency.lockutils [req-c548512b-ed06-42e7-b957-7e64561bda4e req-3524449c-c904-4fc0-9e6f-06b2d6d7d827 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.446 187247 DEBUG oslo_concurrency.lockutils [req-c548512b-ed06-42e7-b957-7e64561bda4e req-3524449c-c904-4fc0-9e6f-06b2d6d7d827 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.446 187247 DEBUG nova.compute.manager [req-c548512b-ed06-42e7-b957-7e64561bda4e req-3524449c-c904-4fc0-9e6f-06b2d6d7d827 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.447 187247 WARNING nova.compute.manager [req-c548512b-ed06-42e7-b957-7e64561bda4e req-3524449c-c904-4fc0-9e6f-06b2d6d7d827 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received unexpected event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with vm_state building and task_state spawning.
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:46 compute-0 nova_compute[187243]: 2025-12-03 00:20:46.927 187247 INFO nova.compute.manager [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Took 17.26 seconds to build instance.
Dec 03 00:20:47 compute-0 nova_compute[187243]: 2025-12-03 00:20:47.431 187247 DEBUG oslo_concurrency.lockutils [None req-5c73b938-1833-434b-9efa-67f49fc63402 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.781s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:48 compute-0 nova_compute[187243]: 2025-12-03 00:20:48.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:49 compute-0 nova_compute[187243]: 2025-12-03 00:20:49.111 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:49 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:20:49.950 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:49 compute-0 nova_compute[187243]: 2025-12-03 00:20:49.956 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:49 compute-0 nova_compute[187243]: 2025-12-03 00:20:49.956 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:50 compute-0 sshd-session[220837]: Received disconnect from 20.123.120.169 port 37924:11: Bye Bye [preauth]
Dec 03 00:20:50 compute-0 sshd-session[220837]: Disconnected from authenticating user root 20.123.120.169 port 37924 [preauth]
Dec 03 00:20:50 compute-0 nova_compute[187243]: 2025-12-03 00:20:50.463 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:20:51 compute-0 nova_compute[187243]: 2025-12-03 00:20:51.033 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:51 compute-0 nova_compute[187243]: 2025-12-03 00:20:51.033 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:51 compute-0 nova_compute[187243]: 2025-12-03 00:20:51.041 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:20:51 compute-0 nova_compute[187243]: 2025-12-03 00:20:51.041 187247 INFO nova.compute.claims [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:20:51 compute-0 nova_compute[187243]: 2025-12-03 00:20:51.195 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:52 compute-0 nova_compute[187243]: 2025-12-03 00:20:52.221 187247 DEBUG nova.compute.provider_tree [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:20:52 compute-0 nova_compute[187243]: 2025-12-03 00:20:52.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:52 compute-0 nova_compute[187243]: 2025-12-03 00:20:52.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:20:52 compute-0 nova_compute[187243]: 2025-12-03 00:20:52.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:20:52 compute-0 nova_compute[187243]: 2025-12-03 00:20:52.825 187247 DEBUG nova.scheduler.client.report [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.334 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.301s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.335 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.337 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.232s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.337 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.337 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.848 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.849 187247 DEBUG nova.network.neutron [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.849 187247 WARNING neutronclient.v2_0.client [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:53 compute-0 nova_compute[187243]: 2025-12-03 00:20:53.849 187247 WARNING neutronclient.v2_0.client [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.112 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.356 187247 INFO nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.374 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.428 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.429 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.447 187247 DEBUG nova.network.neutron [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Successfully created port: bd61e9e8-f7f0-458d-858f-ffb409383310 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.480 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.614 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.615 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.632 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.633 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5664MB free_disk=73.16147232055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.633 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.633 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:54 compute-0 nova_compute[187243]: 2025-12-03 00:20:54.864 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.112 187247 DEBUG nova.network.neutron [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Successfully updated port: bd61e9e8-f7f0-458d-858f-ffb409383310 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.155 187247 DEBUG nova.compute.manager [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-changed-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.156 187247 DEBUG nova.compute.manager [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Refreshing instance network info cache due to event network-changed-bd61e9e8-f7f0-458d-858f-ffb409383310. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.156 187247 DEBUG oslo_concurrency.lockutils [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.157 187247 DEBUG oslo_concurrency.lockutils [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.157 187247 DEBUG nova.network.neutron [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Refreshing network info cache for port bd61e9e8-f7f0-458d-858f-ffb409383310 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.619 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.661 187247 WARNING neutronclient.v2_0.client [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.679 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 7643810a-7499-484f-80e2-2a0a33cafc55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.679 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance bfaf8926-00b3-46a4-b85f-46ee074d049e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.680 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.680 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:20:54 up  1:29,  0 user,  load average: 0.41, 0.32, 0.29\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '2', 'io_workload': '1', 'num_vm_building': '1', 'num_task_networking': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.729 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.734 187247 DEBUG nova.network.neutron [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.882 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.884 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.884 187247 INFO nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Creating image(s)
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.885 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.885 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.886 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.886 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.890 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.891 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.905 187247 DEBUG nova.network.neutron [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.954 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.955 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.956 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.957 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.960 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:20:55 compute-0 nova_compute[187243]: 2025-12-03 00:20:55.961 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.019 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.020 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.054 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.055 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.055 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:56 compute-0 podman[220854]: 2025-12-03 00:20:56.114742604 +0000 UTC m=+0.065734512 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.116 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.117 187247 DEBUG nova.virt.disk.api [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Checking if we can resize image /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:20:56 compute-0 podman[220856]: 2025-12-03 00:20:56.117400399 +0000 UTC m=+0.068231703 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.117 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.173 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.174 187247 DEBUG nova.virt.disk.api [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Cannot resize image /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.174 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.175 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Ensure instance console log exists: /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.175 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.175 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.176 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.197 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.237 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.412 187247 DEBUG oslo_concurrency.lockutils [req-040edd75-0add-4b19-a34e-a575daf57e2e req-c97ba58d-da13-4d0e-abc4-77fd89a783e6 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.413 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquired lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.413 187247 DEBUG nova.network.neutron [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.746 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:20:56 compute-0 nova_compute[187243]: 2025-12-03 00:20:56.746 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:57 compute-0 nova_compute[187243]: 2025-12-03 00:20:57.129 187247 DEBUG nova.network.neutron [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:20:57 compute-0 nova_compute[187243]: 2025-12-03 00:20:57.379 187247 WARNING neutronclient.v2_0.client [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:20:57 compute-0 nova_compute[187243]: 2025-12-03 00:20:57.553 187247 DEBUG nova.network.neutron [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.127 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Releasing lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.128 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance network_info: |[{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.130 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Start _get_guest_xml network_info=[{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.133 187247 WARNING nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.134 187247 DEBUG nova.virt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971', uuid='bfaf8926-00b3-46a4-b85f-46ee074d049e'), owner=OwnerMeta(userid='db24d5b25a924602ae8a7dc539bc6cbf', username='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin', projectid='e363b47741a1476ca7e5987b6d15acb5', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", 
"ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721258.1348295) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.139 187247 DEBUG nova.virt.libvirt.host [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.139 187247 DEBUG nova.virt.libvirt.host [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.141 187247 DEBUG nova.virt.libvirt.host [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.142 187247 DEBUG nova.virt.libvirt.host [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.143 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.143 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.143 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.144 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.144 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.144 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.144 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.144 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.145 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.145 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.145 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.145 187247 DEBUG nova.virt.hardware [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.149 187247 DEBUG nova.virt.libvirt.vif [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-3508609',id=29,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-7vu1e8y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_n
ame='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:20:54Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.149 187247 DEBUG nova.network.os_vif_util [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.150 187247 DEBUG nova.network.os_vif_util [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.151 187247 DEBUG nova.objects.instance [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lazy-loading 'pci_devices' on Instance uuid bfaf8926-00b3-46a4-b85f-46ee074d049e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.784 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <uuid>bfaf8926-00b3-46a4-b85f-46ee074d049e</uuid>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <name>instance-0000001d</name>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971</nova:name>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:58</nova:creationTime>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:20:58 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:20:58 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         <nova:port uuid="bd61e9e8-f7f0-458d-858f-ffb409383310">
Dec 03 00:20:58 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <system>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <entry name="serial">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <entry name="uuid">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </system>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <os>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </os>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <features>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </features>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:03:5f:29"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <target dev="tapbd61e9e8-f7"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <video>
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </video>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:20:58 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:20:58 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:20:58 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:20:58 compute-0 nova_compute[187243]: </domain>
Dec 03 00:20:58 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.786 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Preparing to wait for external event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.786 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.786 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.787 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.787 187247 DEBUG nova.virt.libvirt.vif [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-3508609',id=29,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-7vu1e8y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',ow
ner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:20:54Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.788 187247 DEBUG nova.network.os_vif_util [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converting VIF {"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.788 187247 DEBUG nova.network.os_vif_util [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.789 187247 DEBUG os_vif [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.789 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.790 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.790 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.791 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.792 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1dbe0aa4-c204-536d-925e-647a657a5e0f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.793 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.794 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.797 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.797 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd61e9e8-f7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.798 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbd61e9e8-f7, col_values=(('qos', UUID('0cc557a1-81a1-4dca-82fe-4d9ed316430b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.798 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbd61e9e8-f7, col_values=(('external_ids', {'iface-id': 'bd61e9e8-f7f0-458d-858f-ffb409383310', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:5f:29', 'vm-uuid': 'bfaf8926-00b3-46a4-b85f-46ee074d049e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.799 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 NetworkManager[55671]: <info>  [1764721258.8007] manager: (tapbd61e9e8-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.802 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.806 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:58 compute-0 nova_compute[187243]: 2025-12-03 00:20:58.807 187247 INFO os_vif [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7')
Dec 03 00:20:59 compute-0 ovn_controller[95488]: 2025-12-03T00:20:59Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:ff:2a 10.100.0.8
Dec 03 00:20:59 compute-0 ovn_controller[95488]: 2025-12-03T00:20:59Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:ff:2a 10.100.0.8
Dec 03 00:20:59 compute-0 nova_compute[187243]: 2025-12-03 00:20:59.114 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:20:59 compute-0 sshd-session[220907]: Received disconnect from 45.78.222.160 port 38404:11: Bye Bye [preauth]
Dec 03 00:20:59 compute-0 sshd-session[220907]: Disconnected from authenticating user root 45.78.222.160 port 38404 [preauth]
Dec 03 00:20:59 compute-0 podman[197600]: time="2025-12-03T00:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:20:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:20:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Dec 03 00:21:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:00.721 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:00.722 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:00.722 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:00 compute-0 nova_compute[187243]: 2025-12-03 00:21:00.746 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:00 compute-0 nova_compute[187243]: 2025-12-03 00:21:00.747 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:00 compute-0 nova_compute[187243]: 2025-12-03 00:21:00.747 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:01 compute-0 nova_compute[187243]: 2025-12-03 00:21:01.271 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:21:01 compute-0 nova_compute[187243]: 2025-12-03 00:21:01.272 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:21:01 compute-0 nova_compute[187243]: 2025-12-03 00:21:01.272 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] No VIF found with MAC fa:16:3e:03:5f:29, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:21:01 compute-0 nova_compute[187243]: 2025-12-03 00:21:01.273 187247 INFO nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Using config drive
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: ERROR   00:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: ERROR   00:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: ERROR   00:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: ERROR   00:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: ERROR   00:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:21:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.053 187247 WARNING neutronclient.v2_0.client [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.506 187247 INFO nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Creating config drive at /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.512 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpfv8qerx6 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.638 187247 DEBUG oslo_concurrency.processutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpfv8qerx6" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:02 compute-0 NetworkManager[55671]: <info>  [1764721262.6894] manager: (tapbd61e9e8-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Dec 03 00:21:02 compute-0 kernel: tapbd61e9e8-f7: entered promiscuous mode
Dec 03 00:21:02 compute-0 ovn_controller[95488]: 2025-12-03T00:21:02Z|00213|binding|INFO|Claiming lport bd61e9e8-f7f0-458d-858f-ffb409383310 for this chassis.
Dec 03 00:21:02 compute-0 ovn_controller[95488]: 2025-12-03T00:21:02Z|00214|binding|INFO|bd61e9e8-f7f0-458d-858f-ffb409383310: Claiming fa:16:3e:03:5f:29 10.100.0.13
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.692 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:02 compute-0 ovn_controller[95488]: 2025-12-03T00:21:02Z|00215|binding|INFO|Setting lport bd61e9e8-f7f0-458d-858f-ffb409383310 ovn-installed in OVS
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.707 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:02 compute-0 nova_compute[187243]: 2025-12-03 00:21:02.712 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:02 compute-0 systemd-udevd[220937]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:21:02 compute-0 systemd-machined[153518]: New machine qemu-20-instance-0000001d.
Dec 03 00:21:02 compute-0 NetworkManager[55671]: <info>  [1764721262.7410] device (tapbd61e9e8-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:21:02 compute-0 NetworkManager[55671]: <info>  [1764721262.7417] device (tapbd61e9e8-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:21:02 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001d.
Dec 03 00:21:03 compute-0 ovn_controller[95488]: 2025-12-03T00:21:03Z|00216|binding|INFO|Setting lport bd61e9e8-f7f0-458d-858f-ffb409383310 up in Southbound
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.283 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5f:29 10.100.0.13'], port_security=['fa:16:3e:03:5f:29 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'bfaf8926-00b3-46a4-b85f-46ee074d049e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=bd61e9e8-f7f0-458d-858f-ffb409383310) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.285 104379 INFO neutron.agent.ovn.metadata.agent [-] Port bd61e9e8-f7f0-458d-858f-ffb409383310 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 bound to our chassis
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.286 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.302 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[371273fd-b04c-4d9e-a10a-b18847c00136]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.330 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2836ba-4928-4ec9-bf00-44eb98fab1ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.333 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[b7971a53-c461-4a8b-95bc-08bd16411a73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.361 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[b9659b35-e5f8-4fb3-8426-b63b71d0ec5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.375 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[978cfff3-02e9-4144-97c9-28989b1a2df6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533500, 'reachable_time': 27608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220959, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.391 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[38eac4ce-af0c-4f62-a994-2c6debd75bd9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533512, 'tstamp': 533512}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220960, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533515, 'tstamp': 533515}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220960, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.392 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:03 compute-0 nova_compute[187243]: 2025-12-03 00:21:03.393 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:03 compute-0 nova_compute[187243]: 2025-12-03 00:21:03.395 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.395 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.395 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.396 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.396 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:21:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:03.397 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cd08487f-7f0d-49f3-b688-d95d2eb9f52c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:03 compute-0 nova_compute[187243]: 2025-12-03 00:21:03.800 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.161 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.348 187247 DEBUG nova.compute.manager [req-15d5b7ee-5d31-4dac-a007-4093db9767c1 req-bff81311-5a3d-49f7-b554-40269075e009 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.348 187247 DEBUG oslo_concurrency.lockutils [req-15d5b7ee-5d31-4dac-a007-4093db9767c1 req-bff81311-5a3d-49f7-b554-40269075e009 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.349 187247 DEBUG oslo_concurrency.lockutils [req-15d5b7ee-5d31-4dac-a007-4093db9767c1 req-bff81311-5a3d-49f7-b554-40269075e009 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.349 187247 DEBUG oslo_concurrency.lockutils [req-15d5b7ee-5d31-4dac-a007-4093db9767c1 req-bff81311-5a3d-49f7-b554-40269075e009 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.349 187247 DEBUG nova.compute.manager [req-15d5b7ee-5d31-4dac-a007-4093db9767c1 req-bff81311-5a3d-49f7-b554-40269075e009 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Processing event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.349 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.353 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.355 187247 INFO nova.virt.libvirt.driver [-] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance spawned successfully.
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.356 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.870 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.871 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.872 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.872 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.873 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:21:04 compute-0 nova_compute[187243]: 2025-12-03 00:21:04.873 187247 DEBUG nova.virt.libvirt.driver [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:21:05 compute-0 nova_compute[187243]: 2025-12-03 00:21:05.489 187247 INFO nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Took 9.61 seconds to spawn the instance on the hypervisor.
Dec 03 00:21:05 compute-0 nova_compute[187243]: 2025-12-03 00:21:05.490 187247 DEBUG nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.069 187247 INFO nova.compute.manager [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Took 15.09 seconds to build instance.
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.416 187247 DEBUG nova.compute.manager [req-750f0277-afab-4a6d-bd46-c3c51c80b0dc req-052ac73a-e7ed-48ce-ad38-f87d3b246403 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.416 187247 DEBUG oslo_concurrency.lockutils [req-750f0277-afab-4a6d-bd46-c3c51c80b0dc req-052ac73a-e7ed-48ce-ad38-f87d3b246403 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.416 187247 DEBUG oslo_concurrency.lockutils [req-750f0277-afab-4a6d-bd46-c3c51c80b0dc req-052ac73a-e7ed-48ce-ad38-f87d3b246403 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.417 187247 DEBUG oslo_concurrency.lockutils [req-750f0277-afab-4a6d-bd46-c3c51c80b0dc req-052ac73a-e7ed-48ce-ad38-f87d3b246403 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.417 187247 DEBUG nova.compute.manager [req-750f0277-afab-4a6d-bd46-c3c51c80b0dc req-052ac73a-e7ed-48ce-ad38-f87d3b246403 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.417 187247 WARNING nova.compute.manager [req-750f0277-afab-4a6d-bd46-c3c51c80b0dc req-052ac73a-e7ed-48ce-ad38-f87d3b246403 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received unexpected event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with vm_state active and task_state None.
Dec 03 00:21:06 compute-0 nova_compute[187243]: 2025-12-03 00:21:06.574 187247 DEBUG oslo_concurrency.lockutils [None req-42ac9136-a226-4d8a-9c2a-096f69782090 db24d5b25a924602ae8a7dc539bc6cbf e363b47741a1476ca7e5987b6d15acb5 - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.618s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:08 compute-0 podman[220961]: 2025-12-03 00:21:08.108725288 +0000 UTC m=+0.056672130 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:21:08 compute-0 nova_compute[187243]: 2025-12-03 00:21:08.802 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:09 compute-0 nova_compute[187243]: 2025-12-03 00:21:09.163 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:10 compute-0 nova_compute[187243]: 2025-12-03 00:21:10.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:12 compute-0 podman[220986]: 2025-12-03 00:21:12.134410558 +0000 UTC m=+0.085110536 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Dec 03 00:21:12 compute-0 podman[220987]: 2025-12-03 00:21:12.20509004 +0000 UTC m=+0.133606924 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:21:13 compute-0 sshd-session[221002]: Received disconnect from 102.210.148.92 port 49062:11: Bye Bye [preauth]
Dec 03 00:21:13 compute-0 sshd-session[221002]: Disconnected from authenticating user root 102.210.148.92 port 49062 [preauth]
Dec 03 00:21:13 compute-0 nova_compute[187243]: 2025-12-03 00:21:13.804 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:14 compute-0 nova_compute[187243]: 2025-12-03 00:21:14.165 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:15 compute-0 sshd-session[220582]: Connection closed by 45.78.219.213 port 49420 [preauth]
Dec 03 00:21:17 compute-0 ovn_controller[95488]: 2025-12-03T00:21:17Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:5f:29 10.100.0.13
Dec 03 00:21:17 compute-0 ovn_controller[95488]: 2025-12-03T00:21:17Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:5f:29 10.100.0.13
Dec 03 00:21:18 compute-0 nova_compute[187243]: 2025-12-03 00:21:18.806 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:19 compute-0 nova_compute[187243]: 2025-12-03 00:21:19.167 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:23 compute-0 nova_compute[187243]: 2025-12-03 00:21:23.808 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:24 compute-0 nova_compute[187243]: 2025-12-03 00:21:24.224 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:27 compute-0 podman[221044]: 2025-12-03 00:21:27.098622569 +0000 UTC m=+0.055922491 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:21:27 compute-0 podman[221043]: 2025-12-03 00:21:27.09947861 +0000 UTC m=+0.057596762 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 03 00:21:28 compute-0 nova_compute[187243]: 2025-12-03 00:21:28.810 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:29 compute-0 nova_compute[187243]: 2025-12-03 00:21:29.225 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:29 compute-0 podman[197600]: time="2025-12-03T00:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:21:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:21:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Dec 03 00:21:30 compute-0 sshd-session[221083]: Invalid user intel from 61.220.235.10 port 53968
Dec 03 00:21:30 compute-0 sshd-session[221083]: Received disconnect from 61.220.235.10 port 53968:11: Bye Bye [preauth]
Dec 03 00:21:30 compute-0 sshd-session[221083]: Disconnected from invalid user intel 61.220.235.10 port 53968 [preauth]
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: ERROR   00:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: ERROR   00:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: ERROR   00:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: ERROR   00:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: ERROR   00:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:21:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:21:33 compute-0 ovn_controller[95488]: 2025-12-03T00:21:33Z|00217|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:21:33 compute-0 nova_compute[187243]: 2025-12-03 00:21:33.812 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:33 compute-0 nova_compute[187243]: 2025-12-03 00:21:33.927 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Check if temp file /var/lib/nova/instances/tmpzntdw8ou exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:21:33 compute-0 nova_compute[187243]: 2025-12-03 00:21:33.932 187247 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7643810a-7499-484f-80e2-2a0a33cafc55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:21:33 compute-0 nova_compute[187243]: 2025-12-03 00:21:33.994 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Check if temp file /var/lib/nova/instances/tmp7an9wy_h exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:21:33 compute-0 nova_compute[187243]: 2025-12-03 00:21:33.999 187247 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bfaf8926-00b3-46a4-b85f-46ee074d049e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:21:34 compute-0 nova_compute[187243]: 2025-12-03 00:21:34.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.183 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.245 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.246 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.301 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.302 187247 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Preparing to wait for external event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.302 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.302 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.303 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:38 compute-0 nova_compute[187243]: 2025-12-03 00:21:38.813 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:39 compute-0 podman[221091]: 2025-12-03 00:21:39.098697189 +0000 UTC m=+0.052169399 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:21:39 compute-0 nova_compute[187243]: 2025-12-03 00:21:39.296 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:43 compute-0 podman[221119]: 2025-12-03 00:21:43.091648927 +0000 UTC m=+0.050197991 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 03 00:21:43 compute-0 podman[221120]: 2025-12-03 00:21:43.139924249 +0000 UTC m=+0.095979552 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 03 00:21:43 compute-0 nova_compute[187243]: 2025-12-03 00:21:43.815 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:44 compute-0 nova_compute[187243]: 2025-12-03 00:21:44.346 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:46 compute-0 nova_compute[187243]: 2025-12-03 00:21:46.119 187247 DEBUG nova.compute.manager [req-20d0c400-661c-434f-a4db-1edbe9f99199 req-134bc225-3cf2-416a-bd4a-5469bbd9d05c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:46 compute-0 nova_compute[187243]: 2025-12-03 00:21:46.119 187247 DEBUG oslo_concurrency.lockutils [req-20d0c400-661c-434f-a4db-1edbe9f99199 req-134bc225-3cf2-416a-bd4a-5469bbd9d05c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:46 compute-0 nova_compute[187243]: 2025-12-03 00:21:46.119 187247 DEBUG oslo_concurrency.lockutils [req-20d0c400-661c-434f-a4db-1edbe9f99199 req-134bc225-3cf2-416a-bd4a-5469bbd9d05c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:46 compute-0 nova_compute[187243]: 2025-12-03 00:21:46.120 187247 DEBUG oslo_concurrency.lockutils [req-20d0c400-661c-434f-a4db-1edbe9f99199 req-134bc225-3cf2-416a-bd4a-5469bbd9d05c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:46 compute-0 nova_compute[187243]: 2025-12-03 00:21:46.120 187247 DEBUG nova.compute.manager [req-20d0c400-661c-434f-a4db-1edbe9f99199 req-134bc225-3cf2-416a-bd4a-5469bbd9d05c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No event matching network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf in dict_keys([('network-vif-plugged', '96aba6d6-d4d8-494d-9070-4ad5c1609fdf')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:21:46 compute-0 nova_compute[187243]: 2025-12-03 00:21:46.120 187247 DEBUG nova.compute.manager [req-20d0c400-661c-434f-a4db-1edbe9f99199 req-134bc225-3cf2-416a-bd4a-5469bbd9d05c 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:21:47 compute-0 nova_compute[187243]: 2025-12-03 00:21:47.326 187247 INFO nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Took 9.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:21:47 compute-0 nova_compute[187243]: 2025-12-03 00:21:47.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.391 187247 DEBUG nova.compute.manager [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.392 187247 DEBUG oslo_concurrency.lockutils [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.392 187247 DEBUG oslo_concurrency.lockutils [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.392 187247 DEBUG oslo_concurrency.lockutils [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.393 187247 DEBUG nova.compute.manager [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Processing event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.393 187247 DEBUG nova.compute.manager [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-changed-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.393 187247 DEBUG nova.compute.manager [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Refreshing instance network info cache due to event network-changed-96aba6d6-d4d8-494d-9070-4ad5c1609fdf. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.393 187247 DEBUG oslo_concurrency.lockutils [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.393 187247 DEBUG oslo_concurrency.lockutils [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.394 187247 DEBUG nova.network.neutron [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Refreshing network info cache for port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.395 187247 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:21:48 compute-0 nova_compute[187243]: 2025-12-03 00:21:48.817 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:49 compute-0 nova_compute[187243]: 2025-12-03 00:21:49.347 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:49 compute-0 nova_compute[187243]: 2025-12-03 00:21:49.405 187247 WARNING neutronclient.v2_0.client [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:49 compute-0 nova_compute[187243]: 2025-12-03 00:21:49.527 187247 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzntdw8ou',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7643810a-7499-484f-80e2-2a0a33cafc55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(232c77b8-b3ca-453e-acab-98823e5c2a0a),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.047 187247 DEBUG nova.objects.instance [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 7643810a-7499-484f-80e2-2a0a33cafc55 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.049 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.049 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.050 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.552 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.552 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.560 187247 DEBUG nova.virt.libvirt.vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137830',id=28,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-0dp6pegi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:20:46Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=7643810a-7499-484f-80e2-2a0a33cafc55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.561 187247 DEBUG nova.network.os_vif_util [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.562 187247 DEBUG nova.network.os_vif_util [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.563 187247 DEBUG nova.virt.libvirt.migration [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:9a:ff:2a"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <target dev="tap96aba6d6-d4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]: </interface>
Dec 03 00:21:50 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.563 187247 DEBUG nova.virt.libvirt.migration [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <name>instance-0000001c</name>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <uuid>7643810a-7499-484f-80e2-2a0a33cafc55</uuid>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821</nova:name>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:40</nova:creationTime>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:port uuid="96aba6d6-d4d8-494d-9070-4ad5c1609fdf">
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <system>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="serial">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="uuid">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </system>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <os>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </os>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <features>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </features>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:9a:ff:2a"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap96aba6d6-d4"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </target>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </console>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </input>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <video>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </video>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]: </domain>
Dec 03 00:21:50 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.564 187247 DEBUG nova.virt.libvirt.migration [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <name>instance-0000001c</name>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <uuid>7643810a-7499-484f-80e2-2a0a33cafc55</uuid>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821</nova:name>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:40</nova:creationTime>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:port uuid="96aba6d6-d4d8-494d-9070-4ad5c1609fdf">
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <system>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="serial">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="uuid">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </system>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <os>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </os>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <features>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </features>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9a:ff:2a"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="tap96aba6d6-d4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </target>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </console>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </input>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <video>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </video>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]: </domain>
Dec 03 00:21:50 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.565 187247 DEBUG nova.virt.libvirt.migration [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <name>instance-0000001c</name>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <uuid>7643810a-7499-484f-80e2-2a0a33cafc55</uuid>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821</nova:name>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:40</nova:creationTime>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <nova:port uuid="96aba6d6-d4d8-494d-9070-4ad5c1609fdf">
Dec 03 00:21:50 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <system>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="serial">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="uuid">7643810a-7499-484f-80e2-2a0a33cafc55</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </system>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <os>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </os>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <features>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </features>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/disk.config"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:9a:ff:2a"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target dev="tap96aba6d6-d4"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:21:50 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       </target>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55/console.log" append="off"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </console>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </input>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <video>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </video>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:21:50 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:21:50 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:21:50 compute-0 nova_compute[187243]: </domain>
Dec 03 00:21:50 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.565 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:50 compute-0 nova_compute[187243]: 2025-12-03 00:21:50.958 187247 WARNING neutronclient.v2_0.client [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:51 compute-0 nova_compute[187243]: 2025-12-03 00:21:51.054 187247 DEBUG nova.virt.libvirt.migration [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:21:51 compute-0 nova_compute[187243]: 2025-12-03 00:21:51.054 187247 INFO nova.virt.libvirt.migration [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:21:51 compute-0 nova_compute[187243]: 2025-12-03 00:21:51.264 187247 DEBUG nova.network.neutron [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updated VIF entry in instance network info cache for port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:21:51 compute-0 nova_compute[187243]: 2025-12-03 00:21:51.265 187247 DEBUG nova.network.neutron [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating instance_info_cache with network_info: [{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.023 187247 DEBUG oslo_concurrency.lockutils [req-fdfdc38a-5c5b-40b5-8bb8-8f3effb675fe req-fe6bf059-4099-4766-81e1-f325df3136a4 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-7643810a-7499-484f-80e2-2a0a33cafc55" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.374 187247 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:21:52 compute-0 kernel: tap96aba6d6-d4 (unregistering): left promiscuous mode
Dec 03 00:21:52 compute-0 NetworkManager[55671]: <info>  [1764721312.4902] device (tap96aba6d6-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:21:52 compute-0 ovn_controller[95488]: 2025-12-03T00:21:52Z|00218|binding|INFO|Releasing lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf from this chassis (sb_readonly=0)
Dec 03 00:21:52 compute-0 ovn_controller[95488]: 2025-12-03T00:21:52Z|00219|binding|INFO|Setting lport 96aba6d6-d4d8-494d-9070-4ad5c1609fdf down in Southbound
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.509 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:52 compute-0 ovn_controller[95488]: 2025-12-03T00:21:52Z|00220|binding|INFO|Removing iface tap96aba6d6-d4 ovn-installed in OVS
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.512 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.516 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:ff:2a 10.100.0.8'], port_security=['fa:16:3e:9a:ff:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7643810a-7499-484f-80e2-2a0a33cafc55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=96aba6d6-d4d8-494d-9070-4ad5c1609fdf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.517 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.518 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.525 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.534 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[289a4be0-1d19-402e-bac7-fdf06e73836d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Dec 03 00:21:52 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001c.scope: Consumed 15.632s CPU time.
Dec 03 00:21:52 compute-0 systemd-machined[153518]: Machine qemu-19-instance-0000001c terminated.
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.569 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6446e3-dc13-4ce4-b5c1-e45abf4f2ea4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.573 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[fafcbf49-3346-451e-83a3-6f2761384a97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.612 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[7286ad19-b343-4012-982f-54366cc28227]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.630 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3d9f6c-5fe2-452b-8c1f-03261ded82da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7ff943d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:96:25'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533500, 'reachable_time': 27608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221191, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.648 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8343d8df-20b4-4d87-a04d-073e3df6b8cd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533512, 'tstamp': 533512}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221192, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf7ff943d-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533515, 'tstamp': 533515}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221192, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.649 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.651 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.655 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.655 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ff943d-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.656 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.656 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7ff943d-e0, col_values=(('external_ids', {'iface-id': '636cd919-869d-4a8a-92fa-ec7c18804da5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.656 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:21:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:52.657 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a2541308-535c-4bbe-86d1-8da9b2dc29d4]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:21:52 compute-0 kernel: tap96aba6d6-d4: entered promiscuous mode
Dec 03 00:21:52 compute-0 kernel: tap96aba6d6-d4 (unregistering): left promiscuous mode
Dec 03 00:21:52 compute-0 NetworkManager[55671]: <info>  [1764721312.6841] manager: (tap96aba6d6-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.687 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.728 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.729 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.729 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.876 187247 DEBUG nova.virt.libvirt.guest [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '7643810a-7499-484f-80e2-2a0a33cafc55' (instance-0000001c) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.877 187247 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migration operation has completed
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.877 187247 INFO nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] _post_live_migration() is started..
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.889 187247 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:52 compute-0 nova_compute[187243]: 2025-12-03 00:21:52.889 187247 WARNING neutronclient.v2_0.client [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.089 187247 DEBUG nova.compute.manager [req-2269a29e-3b9e-471f-8880-f5453bdf7285 req-ff50dab1-1774-43b4-bc74-d4054fb30a8b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.090 187247 DEBUG oslo_concurrency.lockutils [req-2269a29e-3b9e-471f-8880-f5453bdf7285 req-ff50dab1-1774-43b4-bc74-d4054fb30a8b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.090 187247 DEBUG oslo_concurrency.lockutils [req-2269a29e-3b9e-471f-8880-f5453bdf7285 req-ff50dab1-1774-43b4-bc74-d4054fb30a8b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.091 187247 DEBUG oslo_concurrency.lockutils [req-2269a29e-3b9e-471f-8880-f5453bdf7285 req-ff50dab1-1774-43b4-bc74-d4054fb30a8b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.091 187247 DEBUG nova.compute.manager [req-2269a29e-3b9e-471f-8880-f5453bdf7285 req-ff50dab1-1774-43b4-bc74-d4054fb30a8b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.091 187247 DEBUG nova.compute.manager [req-2269a29e-3b9e-471f-8880-f5453bdf7285 req-ff50dab1-1774-43b4-bc74-d4054fb30a8b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:21:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:53.239 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:21:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:21:53.240 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.241 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.820 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.966 187247 DEBUG nova.network.neutron [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 96aba6d6-d4d8-494d-9070-4ad5c1609fdf and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.967 187247 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.967 187247 DEBUG nova.virt.libvirt.vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1137830821',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1137830',id=28,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-0dp6pegi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:21:29Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=7643810a-7499-484f-80e2-2a0a33cafc55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.967 187247 DEBUG nova.network.os_vif_util [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "address": "fa:16:3e:9a:ff:2a", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aba6d6-d4", "ovs_interfaceid": "96aba6d6-d4d8-494d-9070-4ad5c1609fdf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.968 187247 DEBUG nova.network.os_vif_util [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.968 187247 DEBUG os_vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.970 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.970 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96aba6d6-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.971 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.972 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.973 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.973 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=592ab62f-d146-4b22-9251-f32755f23e64) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.973 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.974 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.977 187247 INFO os_vif [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:ff:2a,bridge_name='br-int',has_traffic_filtering=True,id=96aba6d6-d4d8-494d-9070-4ad5c1609fdf,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96aba6d6-d4')
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.977 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.978 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.978 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.978 187247 DEBUG nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.979 187247 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Deleting instance files /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55_del
Dec 03 00:21:53 compute-0 nova_compute[187243]: 2025-12-03 00:21:53.980 187247 INFO nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Deletion of /var/lib/nova/instances/7643810a-7499-484f-80e2-2a0a33cafc55_del complete
Dec 03 00:21:54 compute-0 nova_compute[187243]: 2025-12-03 00:21:54.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:54 compute-0 nova_compute[187243]: 2025-12-03 00:21:54.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:54 compute-0 nova_compute[187243]: 2025-12-03 00:21:54.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:54 compute-0 nova_compute[187243]: 2025-12-03 00:21:54.103 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:21:54 compute-0 nova_compute[187243]: 2025-12-03 00:21:54.377 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.142 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.193 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.194 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.243 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.379 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.381 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.399 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.400 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5666MB free_disk=73.13300704956055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.400 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:55 compute-0 nova_compute[187243]: 2025-12-03 00:21:55.401 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.158 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.159 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.159 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.159 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.159 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.160 187247 WARNING nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received unexpected event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with vm_state active and task_state migrating.
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.160 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.160 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.160 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.160 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.160 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.161 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-unplugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.161 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.161 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.161 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.161 187247 DEBUG oslo_concurrency.lockutils [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.162 187247 DEBUG nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] No waiting events found dispatching network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:21:56 compute-0 nova_compute[187243]: 2025-12-03 00:21:56.162 187247 WARNING nova.compute.manager [req-6ad98ba8-741b-4066-8e86-3d5b8cb70781 req-111124c9-6e1b-422a-a1f9-65ea34fba641 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Received unexpected event network-vif-plugged-96aba6d6-d4d8-494d-9070-4ad5c1609fdf for instance with vm_state active and task_state migrating.
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.171 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Updating resource usage from migration 232c77b8-b3ca-453e-acab-98823e5c2a0a
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.172 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating resource usage from migration 2676e5ea-14e0-4423-bac9-b4312d7935f8
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.198 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 232c77b8-b3ca-453e-acab-98823e5c2a0a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.199 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration 2676e5ea-14e0-4423-bac9-b4312d7935f8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.199 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.199 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:21:55 up  1:30,  0 user,  load average: 0.35, 0.33, 0.29\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '2', 'num_os_type_None': '2', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.266 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:21:57 compute-0 nova_compute[187243]: 2025-12-03 00:21:57.830 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:21:58 compute-0 podman[221218]: 2025-12-03 00:21:58.101494086 +0000 UTC m=+0.057159712 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, tcib_managed=true)
Dec 03 00:21:58 compute-0 podman[221219]: 2025-12-03 00:21:58.101514856 +0000 UTC m=+0.055186253 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Dec 03 00:21:58 compute-0 nova_compute[187243]: 2025-12-03 00:21:58.382 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:21:58 compute-0 nova_compute[187243]: 2025-12-03 00:21:58.382 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.982s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:21:58 compute-0 nova_compute[187243]: 2025-12-03 00:21:58.974 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:59 compute-0 nova_compute[187243]: 2025-12-03 00:21:59.379 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:21:59 compute-0 sshd-session[221257]: Invalid user zmarin from 23.95.37.90 port 47814
Dec 03 00:21:59 compute-0 sshd-session[221257]: Received disconnect from 23.95.37.90 port 47814:11: Bye Bye [preauth]
Dec 03 00:21:59 compute-0 sshd-session[221257]: Disconnected from invalid user zmarin 23.95.37.90 port 47814 [preauth]
Dec 03 00:21:59 compute-0 podman[197600]: time="2025-12-03T00:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:21:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:21:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3067 "" "Go-http-client/1.1"
Dec 03 00:22:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:00.723 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:00.723 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:00.724 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:01 compute-0 nova_compute[187243]: 2025-12-03 00:22:01.383 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:01 compute-0 nova_compute[187243]: 2025-12-03 00:22:01.383 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:01 compute-0 nova_compute[187243]: 2025-12-03 00:22:01.383 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: ERROR   00:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: ERROR   00:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: ERROR   00:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: ERROR   00:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: ERROR   00:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:22:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:22:03 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:03.242 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:03 compute-0 nova_compute[187243]: 2025-12-03 00:22:03.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:03 compute-0 nova_compute[187243]: 2025-12-03 00:22:03.976 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:04 compute-0 nova_compute[187243]: 2025-12-03 00:22:04.381 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:08 compute-0 nova_compute[187243]: 2025-12-03 00:22:08.978 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:09 compute-0 nova_compute[187243]: 2025-12-03 00:22:09.382 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.025 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.025 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.026 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "7643810a-7499-484f-80e2-2a0a33cafc55-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:10 compute-0 podman[221260]: 2025-12-03 00:22:10.092614503 +0000 UTC m=+0.050428476 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.541 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.541 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.541 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:10 compute-0 nova_compute[187243]: 2025-12-03 00:22:10.541 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.591 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.640 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.641 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.692 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.826 187247 WARNING nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.827 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.845 187247 DEBUG oslo_concurrency.processutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.846 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=73.13300704956055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.846 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:11 compute-0 nova_compute[187243]: 2025-12-03 00:22:11.846 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:12 compute-0 nova_compute[187243]: 2025-12-03 00:22:12.868 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 7643810a-7499-484f-80e2-2a0a33cafc55 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.379 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.380 187247 INFO nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating resource usage from migration 2676e5ea-14e0-4423-bac9-b4312d7935f8
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.419 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 232c77b8-b3ca-453e-acab-98823e5c2a0a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.419 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 2676e5ea-14e0-4423-bac9-b4312d7935f8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.420 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.420 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:22:11 up  1:30,  0 user,  load average: 0.25, 0.31, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e363b47741a1476ca7e5987b6d15acb5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.435 187247 DEBUG nova.scheduler.client.report [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.449 187247 DEBUG nova.scheduler.client.report [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.449 187247 DEBUG nova.compute.provider_tree [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.459 187247 DEBUG nova.scheduler.client.report [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.475 187247 DEBUG nova.scheduler.client.report [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.523 187247 DEBUG nova.compute.provider_tree [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:22:13 compute-0 nova_compute[187243]: 2025-12-03 00:22:13.980 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:14 compute-0 nova_compute[187243]: 2025-12-03 00:22:14.030 187247 DEBUG nova.scheduler.client.report [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:22:14 compute-0 podman[221291]: 2025-12-03 00:22:14.107419749 +0000 UTC m=+0.069295469 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 00:22:14 compute-0 podman[221292]: 2025-12-03 00:22:14.135388454 +0000 UTC m=+0.089996376 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:22:14 compute-0 nova_compute[187243]: 2025-12-03 00:22:14.384 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:14 compute-0 nova_compute[187243]: 2025-12-03 00:22:14.542 187247 DEBUG nova.compute.resource_tracker [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:22:14 compute-0 nova_compute[187243]: 2025-12-03 00:22:14.543 187247 DEBUG oslo_concurrency.lockutils [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.696s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:14 compute-0 nova_compute[187243]: 2025-12-03 00:22:14.565 187247 INFO nova.compute.manager [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:22:15 compute-0 nova_compute[187243]: 2025-12-03 00:22:15.672 187247 INFO nova.scheduler.client.report [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 232c77b8-b3ca-453e-acab-98823e5c2a0a
Dec 03 00:22:15 compute-0 nova_compute[187243]: 2025-12-03 00:22:15.672 187247 DEBUG nova.virt.libvirt.driver [None req-ccb0a519-4860-4996-9d06-c95fce88a95f 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 7643810a-7499-484f-80e2-2a0a33cafc55] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.691 187247 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.753 187247 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.754 187247 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.833 187247 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.835 187247 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Preparing to wait for external event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.835 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.835 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:16 compute-0 nova_compute[187243]: 2025-12-03 00:22:16.835 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.593 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.594 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.594 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.594 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.595 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:17 compute-0 nova_compute[187243]: 2025-12-03 00:22:17.595 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:18 compute-0 sshd-session[221340]: Invalid user andy from 20.123.120.169 port 44502
Dec 03 00:22:18 compute-0 sshd-session[221340]: Received disconnect from 20.123.120.169 port 44502:11: Bye Bye [preauth]
Dec 03 00:22:18 compute-0 sshd-session[221340]: Disconnected from invalid user andy 20.123.120.169 port 44502 [preauth]
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.610 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.611 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Image id 92e79321-71af-44a0-869c-1d5a9da5fefc yields fingerprint 4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.611 187247 INFO nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] image 92e79321-71af-44a0-869c-1d5a9da5fefc at (/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0): checking
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.611 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] image 92e79321-71af-44a0-869c-1d5a9da5fefc at (/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0): image is in use _mark_in_use /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:279
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.613 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.613 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] bfaf8926-00b3-46a4-b85f-46ee074d049e is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.613 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] bfaf8926-00b3-46a4-b85f-46ee074d049e has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.614 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.664 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.665 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance bfaf8926-00b3-46a4-b85f-46ee074d049e is backed by 4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.665 187247 INFO nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Active base files: /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.666 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.666 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.666 187247 DEBUG nova.virt.libvirt.imagecache [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Dec 03 00:22:18 compute-0 nova_compute[187243]: 2025-12-03 00:22:18.982 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:19 compute-0 nova_compute[187243]: 2025-12-03 00:22:19.387 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:22 compute-0 sshd-session[221359]: Invalid user bodega from 102.210.148.92 port 56430
Dec 03 00:22:23 compute-0 sshd-session[221359]: Received disconnect from 102.210.148.92 port 56430:11: Bye Bye [preauth]
Dec 03 00:22:23 compute-0 sshd-session[221359]: Disconnected from invalid user bodega 102.210.148.92 port 56430 [preauth]
Dec 03 00:22:23 compute-0 ovn_controller[95488]: 2025-12-03T00:22:23Z|00221|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec 03 00:22:23 compute-0 nova_compute[187243]: 2025-12-03 00:22:23.984 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:24 compute-0 nova_compute[187243]: 2025-12-03 00:22:24.390 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:25 compute-0 nova_compute[187243]: 2025-12-03 00:22:25.114 187247 DEBUG nova.compute.manager [req-e1ec62a9-d166-4127-b9f5-7f6d26e31df1 req-f5cd321f-fbd7-413e-bcdb-9d2dc9a23f5f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:25 compute-0 nova_compute[187243]: 2025-12-03 00:22:25.115 187247 DEBUG oslo_concurrency.lockutils [req-e1ec62a9-d166-4127-b9f5-7f6d26e31df1 req-f5cd321f-fbd7-413e-bcdb-9d2dc9a23f5f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:25 compute-0 nova_compute[187243]: 2025-12-03 00:22:25.115 187247 DEBUG oslo_concurrency.lockutils [req-e1ec62a9-d166-4127-b9f5-7f6d26e31df1 req-f5cd321f-fbd7-413e-bcdb-9d2dc9a23f5f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:25 compute-0 nova_compute[187243]: 2025-12-03 00:22:25.115 187247 DEBUG oslo_concurrency.lockutils [req-e1ec62a9-d166-4127-b9f5-7f6d26e31df1 req-f5cd321f-fbd7-413e-bcdb-9d2dc9a23f5f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:25 compute-0 nova_compute[187243]: 2025-12-03 00:22:25.115 187247 DEBUG nova.compute.manager [req-e1ec62a9-d166-4127-b9f5-7f6d26e31df1 req-f5cd321f-fbd7-413e-bcdb-9d2dc9a23f5f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No event matching network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 in dict_keys([('network-vif-plugged', 'bd61e9e8-f7f0-458d-858f-ffb409383310')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:22:25 compute-0 nova_compute[187243]: 2025-12-03 00:22:25.116 187247 DEBUG nova.compute.manager [req-e1ec62a9-d166-4127-b9f5-7f6d26e31df1 req-f5cd321f-fbd7-413e-bcdb-9d2dc9a23f5f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.195 187247 DEBUG nova.compute.manager [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.195 187247 DEBUG oslo_concurrency.lockutils [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.196 187247 DEBUG oslo_concurrency.lockutils [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.196 187247 DEBUG oslo_concurrency.lockutils [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.197 187247 DEBUG nova.compute.manager [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Processing event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.197 187247 DEBUG nova.compute.manager [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-changed-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.197 187247 DEBUG nova.compute.manager [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Refreshing instance network info cache due to event network-changed-bd61e9e8-f7f0-458d-858f-ffb409383310. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.198 187247 DEBUG oslo_concurrency.lockutils [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.198 187247 DEBUG oslo_concurrency.lockutils [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.199 187247 DEBUG nova.network.neutron [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Refreshing network info cache for port bd61e9e8-f7f0-458d-858f-ffb409383310 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.706 187247 WARNING neutronclient.v2_0.client [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.862 187247 INFO nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Took 11.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:22:27 compute-0 nova_compute[187243]: 2025-12-03 00:22:27.863 187247 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:22:28 compute-0 nova_compute[187243]: 2025-12-03 00:22:28.369 187247 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7an9wy_h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bfaf8926-00b3-46a4-b85f-46ee074d049e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2676e5ea-14e0-4423-bac9-b4312d7935f8),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:22:28 compute-0 nova_compute[187243]: 2025-12-03 00:22:28.469 187247 WARNING neutronclient.v2_0.client [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:28 compute-0 nova_compute[187243]: 2025-12-03 00:22:28.987 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.029 187247 DEBUG nova.network.neutron [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updated VIF entry in instance network info cache for port bd61e9e8-f7f0-458d-858f-ffb409383310. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.030 187247 DEBUG nova.network.neutron [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating instance_info_cache with network_info: [{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.036 187247 DEBUG nova.objects.instance [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid bfaf8926-00b3-46a4-b85f-46ee074d049e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.038 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.040 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.040 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:22:29 compute-0 podman[221361]: 2025-12-03 00:22:29.095500246 +0000 UTC m=+0.056995067 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 03 00:22:29 compute-0 podman[221362]: 2025-12-03 00:22:29.108803232 +0000 UTC m=+0.060806471 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.391 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.542 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.543 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:22:29 compute-0 podman[197600]: time="2025-12-03T00:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:22:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:22:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3071 "" "Go-http-client/1.1"
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.964 187247 DEBUG nova.virt.libvirt.vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-3508609',id=29,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:21:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-7vu1e8y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:21:05Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.964 187247 DEBUG nova.network.os_vif_util [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.965 187247 DEBUG nova.network.os_vif_util [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.966 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:03:5f:29"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <target dev="tapbd61e9e8-f7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]: </interface>
Dec 03 00:22:29 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.966 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <name>instance-0000001d</name>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <uuid>bfaf8926-00b3-46a4-b85f-46ee074d049e</uuid>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971</nova:name>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:58</nova:creationTime>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:port uuid="bd61e9e8-f7f0-458d-858f-ffb409383310">
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <system>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="serial">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="uuid">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </system>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <os>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </os>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <features>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </features>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:03:5f:29"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbd61e9e8-f7"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </target>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </console>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </input>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <video>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </video>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]: </domain>
Dec 03 00:22:29 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.968 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <name>instance-0000001d</name>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <uuid>bfaf8926-00b3-46a4-b85f-46ee074d049e</uuid>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971</nova:name>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:58</nova:creationTime>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:port uuid="bd61e9e8-f7f0-458d-858f-ffb409383310">
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <system>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="serial">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="uuid">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </system>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <os>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </os>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <features>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </features>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:03:5f:29"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="tapbd61e9e8-f7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </target>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </console>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </input>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <video>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </video>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]: </domain>
Dec 03 00:22:29 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.970 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <name>instance-0000001d</name>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <uuid>bfaf8926-00b3-46a4-b85f-46ee074d049e</uuid>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971</nova:name>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:20:58</nova:creationTime>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:user uuid="db24d5b25a924602ae8a7dc539bc6cbf">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin</nova:user>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:project uuid="e363b47741a1476ca7e5987b6d15acb5">tempest-TestExecuteWorkloadStabilizationStrategy-2023481445</nova:project>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <nova:port uuid="bd61e9e8-f7f0-458d-858f-ffb409383310">
Dec 03 00:22:29 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <system>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="serial">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="uuid">bfaf8926-00b3-46a4-b85f-46ee074d049e</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </system>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <os>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </os>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <features>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </features>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/disk.config"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:03:5f:29"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target dev="tapbd61e9e8-f7"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:22:29 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       </target>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e/console.log" append="off"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </console>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </input>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <video>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </video>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:22:29 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:22:29 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:22:29 compute-0 nova_compute[187243]: </domain>
Dec 03 00:22:29 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:22:29 compute-0 nova_compute[187243]: 2025-12-03 00:22:29.972 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:22:30 compute-0 nova_compute[187243]: 2025-12-03 00:22:30.045 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:22:30 compute-0 nova_compute[187243]: 2025-12-03 00:22:30.045 187247 INFO nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:22:30 compute-0 nova_compute[187243]: 2025-12-03 00:22:30.331 187247 DEBUG oslo_concurrency.lockutils [req-73d92166-7fc5-4790-a26f-acc1ef5b9c71 req-4e81eef5-5d86-421c-ba0f-a3a06dc79ce5 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-bfaf8926-00b3-46a4-b85f-46ee074d049e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:22:31 compute-0 nova_compute[187243]: 2025-12-03 00:22:31.102 187247 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: ERROR   00:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: ERROR   00:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: ERROR   00:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: ERROR   00:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: ERROR   00:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:22:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:22:31 compute-0 nova_compute[187243]: 2025-12-03 00:22:31.607 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:22:31 compute-0 nova_compute[187243]: 2025-12-03 00:22:31.608 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.113 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.114 187247 DEBUG nova.virt.libvirt.migration [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:22:32 compute-0 kernel: tapbd61e9e8-f7 (unregistering): left promiscuous mode
Dec 03 00:22:32 compute-0 NetworkManager[55671]: <info>  [1764721352.2141] device (tapbd61e9e8-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:22:32 compute-0 ovn_controller[95488]: 2025-12-03T00:22:32Z|00222|binding|INFO|Releasing lport bd61e9e8-f7f0-458d-858f-ffb409383310 from this chassis (sb_readonly=0)
Dec 03 00:22:32 compute-0 ovn_controller[95488]: 2025-12-03T00:22:32Z|00223|binding|INFO|Setting lport bd61e9e8-f7f0-458d-858f-ffb409383310 down in Southbound
Dec 03 00:22:32 compute-0 ovn_controller[95488]: 2025-12-03T00:22:32Z|00224|binding|INFO|Removing iface tapbd61e9e8-f7 ovn-installed in OVS
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.231 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.233 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.246 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:32 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Dec 03 00:22:32 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001d.scope: Consumed 15.945s CPU time.
Dec 03 00:22:32 compute-0 systemd-machined[153518]: Machine qemu-20-instance-0000001d terminated.
Dec 03 00:22:32 compute-0 ovn_controller[95488]: 2025-12-03T00:22:32Z|00225|binding|INFO|Releasing lport 636cd919-869d-4a8a-92fa-ec7c18804da5 from this chassis (sb_readonly=0)
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.322 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:5f:29 10.100.0.13'], port_security=['fa:16:3e:03:5f:29 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'bfaf8926-00b3-46a4-b85f-46ee074d049e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e363b47741a1476ca7e5987b6d15acb5', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'aa7bb0b8-346d-4df1-ade9-c8e68672df4d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=89eb3cb7-f38b-4b5d-9a3e-18a5b3602924, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=bd61e9e8-f7f0-458d-858f-ffb409383310) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.323 104379 INFO neutron.agent.ovn.metadata.agent [-] Port bd61e9e8-f7f0-458d-858f-ffb409383310 in datapath f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 unbound from our chassis
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.323 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.324 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a49575f4-f7cb-44ac-997e-31bed9acd8ed]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.325 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 namespace which is not needed anymore
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.365 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:32 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [NOTICE]   (220825) : haproxy version is 3.0.5-8e879a5
Dec 03 00:22:32 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [NOTICE]   (220825) : path to executable is /usr/sbin/haproxy
Dec 03 00:22:32 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [WARNING]  (220825) : Exiting Master process...
Dec 03 00:22:32 compute-0 podman[221434]: 2025-12-03 00:22:32.431389549 +0000 UTC m=+0.030316434 container kill 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:22:32 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [ALERT]    (220825) : Current worker (220827) exited with code 143 (Terminated)
Dec 03 00:22:32 compute-0 neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89[220821]: [WARNING]  (220825) : All workers exited. Exiting... (0)
Dec 03 00:22:32 compute-0 systemd[1]: libpod-1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4.scope: Deactivated successfully.
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.446 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.446 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.446 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:22:32 compute-0 podman[221462]: 2025-12-03 00:22:32.469470092 +0000 UTC m=+0.022698967 container died 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:22:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4-userdata-shm.mount: Deactivated successfully.
Dec 03 00:22:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd186f06f51cbd737a3dcb973cb3a14faf33a33b0390bc2cc9298553b805d811-merged.mount: Deactivated successfully.
Dec 03 00:22:32 compute-0 podman[221462]: 2025-12-03 00:22:32.509314278 +0000 UTC m=+0.062543153 container cleanup 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:22:32 compute-0 systemd[1]: libpod-conmon-1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4.scope: Deactivated successfully.
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.517 187247 DEBUG nova.compute.manager [req-9bd2aa96-9d3c-4483-bae8-f6b953b65121 req-ea8eec44-deeb-4042-859c-6e8c39842423 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.518 187247 DEBUG oslo_concurrency.lockutils [req-9bd2aa96-9d3c-4483-bae8-f6b953b65121 req-ea8eec44-deeb-4042-859c-6e8c39842423 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.518 187247 DEBUG oslo_concurrency.lockutils [req-9bd2aa96-9d3c-4483-bae8-f6b953b65121 req-ea8eec44-deeb-4042-859c-6e8c39842423 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.518 187247 DEBUG oslo_concurrency.lockutils [req-9bd2aa96-9d3c-4483-bae8-f6b953b65121 req-ea8eec44-deeb-4042-859c-6e8c39842423 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.518 187247 DEBUG nova.compute.manager [req-9bd2aa96-9d3c-4483-bae8-f6b953b65121 req-ea8eec44-deeb-4042-859c-6e8c39842423 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.519 187247 DEBUG nova.compute.manager [req-9bd2aa96-9d3c-4483-bae8-f6b953b65121 req-ea8eec44-deeb-4042-859c-6e8c39842423 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:32 compute-0 podman[221473]: 2025-12-03 00:22:32.525674489 +0000 UTC m=+0.066740756 container remove 1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.539 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[94157f69-c3c8-497a-92be-de4cf088c355]: (4, ("Wed Dec  3 12:22:32 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4)\n1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4\nWed Dec  3 12:22:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 (1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4)\n1a0bb9d4b0289324ffba1e79a06898b56fd3953911dd24786a7fb411a6c924f4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.540 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[64831350-9006-4273-81a1-d2b1915d27de]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.541 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.541 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d79e67c0-52e7-4c3e-b21a-a9ae6b84b3ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.542 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ff943d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.543 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:32 compute-0 kernel: tapf7ff943d-e0: left promiscuous mode
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.556 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.559 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5d68c4bc-4277-4e02-8dac-1317af4084fc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.577 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9e507a3f-be2d-42b9-9d22-e922a4ad2047]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.579 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5394f1cf-63ae-4f62-bbcb-3c7eb3b015e2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.593 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0c982f-379b-4188-818a-cc3133d6fb2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533493, 'reachable_time': 36041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221502, 'error': None, 'target': 'ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.595 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:22:32 compute-0 systemd[1]: run-netns-ovnmeta\x2df7ff943d\x2de57d\x2d4bc2\x2d8dd6\x2df8a8bb6e4f89.mount: Deactivated successfully.
Dec 03 00:22:32 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:32.596 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[f64bd24d-5627-4f5d-b6a9-5cd2a027c65f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.616 187247 DEBUG nova.virt.libvirt.guest [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'bfaf8926-00b3-46a4-b85f-46ee074d049e' (instance-0000001d) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.617 187247 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migration operation has completed
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.617 187247 INFO nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] _post_live_migration() is started..
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.656 187247 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:32 compute-0 nova_compute[187243]: 2025-12-03 00:22:32.657 187247 WARNING neutronclient.v2_0.client [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.248 187247 DEBUG nova.compute.manager [req-6b4cc3c2-bf39-4203-ae8b-2e6773280f44 req-103a704f-c0d7-454c-89a4-b58fe097f14b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.248 187247 DEBUG oslo_concurrency.lockutils [req-6b4cc3c2-bf39-4203-ae8b-2e6773280f44 req-103a704f-c0d7-454c-89a4-b58fe097f14b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.248 187247 DEBUG oslo_concurrency.lockutils [req-6b4cc3c2-bf39-4203-ae8b-2e6773280f44 req-103a704f-c0d7-454c-89a4-b58fe097f14b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.249 187247 DEBUG oslo_concurrency.lockutils [req-6b4cc3c2-bf39-4203-ae8b-2e6773280f44 req-103a704f-c0d7-454c-89a4-b58fe097f14b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.249 187247 DEBUG nova.compute.manager [req-6b4cc3c2-bf39-4203-ae8b-2e6773280f44 req-103a704f-c0d7-454c-89a4-b58fe097f14b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.249 187247 DEBUG nova.compute.manager [req-6b4cc3c2-bf39-4203-ae8b-2e6773280f44 req-103a704f-c0d7-454c-89a4-b58fe097f14b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.340 187247 DEBUG nova.network.neutron [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port bd61e9e8-f7f0-458d-858f-ffb409383310 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.341 187247 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.342 187247 DEBUG nova.virt.libvirt.vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-350860971',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-3508609',id=29,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:21:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e363b47741a1476ca7e5987b6d15acb5',ramdisk_id='',reservation_id='r-7vu1e8y1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-2023481445-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:21:29Z,user_data=None,user_id='db24d5b25a924602ae8a7dc539bc6cbf',uuid=bfaf8926-00b3-46a4-b85f-46ee074d049e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.342 187247 DEBUG nova.network.os_vif_util [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "bd61e9e8-f7f0-458d-858f-ffb409383310", "address": "fa:16:3e:03:5f:29", "network": {"id": "f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2089797190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3da51bfd7f1c491b839f6b6b49056c8b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd61e9e8-f7", "ovs_interfaceid": "bd61e9e8-f7f0-458d-858f-ffb409383310", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.343 187247 DEBUG nova.network.os_vif_util [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.343 187247 DEBUG os_vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.347 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.347 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd61e9e8-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.349 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.350 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.350 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.351 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0cc557a1-81a1-4dca-82fe-4d9ed316430b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.351 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.352 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.354 187247 INFO os_vif [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:5f:29,bridge_name='br-int',has_traffic_filtering=True,id=bd61e9e8-f7f0-458d-858f-ffb409383310,network=Network(f7ff943d-e57d-4bc2-8dd6-f8a8bb6e4f89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd61e9e8-f7')
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.354 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.355 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.355 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.355 187247 DEBUG nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.355 187247 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Deleting instance files /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e_del
Dec 03 00:22:33 compute-0 nova_compute[187243]: 2025-12-03 00:22:33.356 187247 INFO nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Deletion of /var/lib/nova/instances/bfaf8926-00b3-46a4-b85f-46ee074d049e_del complete
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.392 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.756 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.756 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 WARNING nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received unexpected event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with vm_state active and task_state migrating.
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.757 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-unplugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.758 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 WARNING nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received unexpected event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with vm_state active and task_state migrating.
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 DEBUG oslo_concurrency.lockutils [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 DEBUG nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] No waiting events found dispatching network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:22:34 compute-0 nova_compute[187243]: 2025-12-03 00:22:34.759 187247 WARNING nova.compute.manager [req-7b37b313-2a37-4844-8d65-88817d6a4584 req-c58151d6-03f6-49af-90d4-2fc7ad91a65d 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Received unexpected event network-vif-plugged-bd61e9e8-f7f0-458d-858f-ffb409383310 for instance with vm_state active and task_state migrating.
Dec 03 00:22:38 compute-0 nova_compute[187243]: 2025-12-03 00:22:38.351 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:39 compute-0 nova_compute[187243]: 2025-12-03 00:22:39.395 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:41 compute-0 podman[221503]: 2025-12-03 00:22:41.102676757 +0000 UTC m=+0.056597308 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:22:43 compute-0 nova_compute[187243]: 2025-12-03 00:22:43.353 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:43 compute-0 nova_compute[187243]: 2025-12-03 00:22:43.889 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:43 compute-0 nova_compute[187243]: 2025-12-03 00:22:43.889 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:43 compute-0 nova_compute[187243]: 2025-12-03 00:22:43.889 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "bfaf8926-00b3-46a4-b85f-46ee074d049e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.397 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.420 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.420 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.420 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.421 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:22:44 compute-0 podman[221528]: 2025-12-03 00:22:44.551108926 +0000 UTC m=+0.089436352 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:22:44 compute-0 podman[221529]: 2025-12-03 00:22:44.566459162 +0000 UTC m=+0.101149969 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true)
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.626 187247 WARNING nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.627 187247 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.648 187247 DEBUG oslo_concurrency.processutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.650 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.16231155395508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.650 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:44 compute-0 nova_compute[187243]: 2025-12-03 00:22:44.651 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:45 compute-0 nova_compute[187243]: 2025-12-03 00:22:45.675 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance bfaf8926-00b3-46a4-b85f-46ee074d049e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:22:46 compute-0 nova_compute[187243]: 2025-12-03 00:22:46.182 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:22:46 compute-0 nova_compute[187243]: 2025-12-03 00:22:46.208 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 2676e5ea-14e0-4423-bac9-b4312d7935f8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:22:46 compute-0 nova_compute[187243]: 2025-12-03 00:22:46.209 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:22:46 compute-0 nova_compute[187243]: 2025-12-03 00:22:46.209 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:22:44 up  1:30,  0 user,  load average: 0.15, 0.28, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:22:46 compute-0 nova_compute[187243]: 2025-12-03 00:22:46.238 187247 DEBUG nova.compute.provider_tree [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:22:46 compute-0 nova_compute[187243]: 2025-12-03 00:22:46.746 187247 DEBUG nova.scheduler.client.report [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:22:47 compute-0 nova_compute[187243]: 2025-12-03 00:22:47.260 187247 DEBUG nova.compute.resource_tracker [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:22:47 compute-0 nova_compute[187243]: 2025-12-03 00:22:47.260 187247 DEBUG oslo_concurrency.lockutils [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.609s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:47 compute-0 nova_compute[187243]: 2025-12-03 00:22:47.343 187247 INFO nova.compute.manager [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:22:48 compute-0 nova_compute[187243]: 2025-12-03 00:22:48.354 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:48 compute-0 nova_compute[187243]: 2025-12-03 00:22:48.414 187247 INFO nova.scheduler.client.report [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 2676e5ea-14e0-4423-bac9-b4312d7935f8
Dec 03 00:22:48 compute-0 nova_compute[187243]: 2025-12-03 00:22:48.415 187247 DEBUG nova.virt.libvirt.driver [None req-f7706249-0564-4967-994b-42ce5bf6d341 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: bfaf8926-00b3-46a4-b85f-46ee074d049e] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:22:49 compute-0 nova_compute[187243]: 2025-12-03 00:22:49.401 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:50 compute-0 nova_compute[187243]: 2025-12-03 00:22:50.666 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:50 compute-0 nova_compute[187243]: 2025-12-03 00:22:50.666 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:51 compute-0 nova_compute[187243]: 2025-12-03 00:22:51.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:53 compute-0 nova_compute[187243]: 2025-12-03 00:22:53.357 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:54 compute-0 nova_compute[187243]: 2025-12-03 00:22:54.464 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.100 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:55 compute-0 sshd-session[221577]: Invalid user bodega from 61.220.235.10 port 53120
Dec 03 00:22:55 compute-0 sshd-session[221577]: Received disconnect from 61.220.235.10 port 53120:11: Bye Bye [preauth]
Dec 03 00:22:55 compute-0 sshd-session[221577]: Disconnected from invalid user bodega 61.220.235.10 port 53120 [preauth]
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.620 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.621 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.621 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.621 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.747 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.748 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.764 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.765 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.16228866577148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.765 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:22:55 compute-0 nova_compute[187243]: 2025-12-03 00:22:55.765 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:22:56 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:56.060 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:22:56 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:22:56.060 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:22:56 compute-0 nova_compute[187243]: 2025-12-03 00:22:56.061 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:56 compute-0 nova_compute[187243]: 2025-12-03 00:22:56.817 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:22:56 compute-0 nova_compute[187243]: 2025-12-03 00:22:56.818 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:22:55 up  1:31,  0 user,  load average: 0.12, 0.26, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:22:56 compute-0 nova_compute[187243]: 2025-12-03 00:22:56.837 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:22:57 compute-0 nova_compute[187243]: 2025-12-03 00:22:57.353 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:22:57 compute-0 sshd-session[221573]: Invalid user temp from 45.78.219.213 port 59806
Dec 03 00:22:57 compute-0 nova_compute[187243]: 2025-12-03 00:22:57.863 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:22:57 compute-0 nova_compute[187243]: 2025-12-03 00:22:57.864 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:22:58 compute-0 sshd-session[221575]: Invalid user vncuser from 45.78.219.95 port 34804
Dec 03 00:22:58 compute-0 nova_compute[187243]: 2025-12-03 00:22:58.355 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:58 compute-0 nova_compute[187243]: 2025-12-03 00:22:58.355 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:58 compute-0 nova_compute[187243]: 2025-12-03 00:22:58.356 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:22:58 compute-0 nova_compute[187243]: 2025-12-03 00:22:58.358 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:58 compute-0 sshd-session[221575]: Received disconnect from 45.78.219.95 port 34804:11: Bye Bye [preauth]
Dec 03 00:22:58 compute-0 sshd-session[221575]: Disconnected from invalid user vncuser 45.78.219.95 port 34804 [preauth]
Dec 03 00:22:58 compute-0 sshd-session[221573]: Received disconnect from 45.78.219.213 port 59806:11: Bye Bye [preauth]
Dec 03 00:22:58 compute-0 sshd-session[221573]: Disconnected from invalid user temp 45.78.219.213 port 59806 [preauth]
Dec 03 00:22:59 compute-0 nova_compute[187243]: 2025-12-03 00:22:59.466 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:22:59 compute-0 nova_compute[187243]: 2025-12-03 00:22:59.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:22:59 compute-0 podman[197600]: time="2025-12-03T00:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:22:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:22:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:23:00 compute-0 podman[221581]: 2025-12-03 00:23:00.101355855 +0000 UTC m=+0.055212144 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:23:00 compute-0 podman[221582]: 2025-12-03 00:23:00.102099153 +0000 UTC m=+0.052395005 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:23:00 compute-0 nova_compute[187243]: 2025-12-03 00:23:00.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:00.725 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:00.725 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:00.725 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: ERROR   00:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: ERROR   00:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: ERROR   00:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: ERROR   00:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: ERROR   00:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:23:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:23:01 compute-0 nova_compute[187243]: 2025-12-03 00:23:01.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:01 compute-0 nova_compute[187243]: 2025-12-03 00:23:01.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:23:02 compute-0 nova_compute[187243]: 2025-12-03 00:23:02.106 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:23:02 compute-0 nova_compute[187243]: 2025-12-03 00:23:02.107 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:02 compute-0 nova_compute[187243]: 2025-12-03 00:23:02.107 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:23:03 compute-0 nova_compute[187243]: 2025-12-03 00:23:03.359 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:04 compute-0 nova_compute[187243]: 2025-12-03 00:23:04.520 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:05 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:05.061 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:05 compute-0 nova_compute[187243]: 2025-12-03 00:23:05.615 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:08 compute-0 nova_compute[187243]: 2025-12-03 00:23:08.361 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:09 compute-0 nova_compute[187243]: 2025-12-03 00:23:09.520 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:12 compute-0 podman[221623]: 2025-12-03 00:23:12.119262688 +0000 UTC m=+0.061610050 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:23:12 compute-0 nova_compute[187243]: 2025-12-03 00:23:12.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:13 compute-0 nova_compute[187243]: 2025-12-03 00:23:13.363 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:14 compute-0 nova_compute[187243]: 2025-12-03 00:23:14.551 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:15 compute-0 podman[221647]: 2025-12-03 00:23:15.102918421 +0000 UTC m=+0.055000739 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 00:23:15 compute-0 podman[221648]: 2025-12-03 00:23:15.15065889 +0000 UTC m=+0.095535551 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 03 00:23:18 compute-0 sshd-session[221690]: Received disconnect from 23.95.37.90 port 40128:11: Bye Bye [preauth]
Dec 03 00:23:18 compute-0 sshd-session[221690]: Disconnected from authenticating user root 23.95.37.90 port 40128 [preauth]
Dec 03 00:23:18 compute-0 nova_compute[187243]: 2025-12-03 00:23:18.400 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:19 compute-0 nova_compute[187243]: 2025-12-03 00:23:19.592 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:22 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:22.134 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9c:f9:e6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '714680a21a7947948f824493a7b261e0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=45446e36-d2c9-4ea6-b9fb-83e2711350dd) old=Port_Binding(mac=['fa:16:3e:9c:f9:e6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '714680a21a7947948f824493a7b261e0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:22 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:22.135 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 45446e36-d2c9-4ea6-b9fb-83e2711350dd in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 updated
Dec 03 00:23:22 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:22.136 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:23:22 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:22.136 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[12d22209-724a-4883-967a-e96f433c0980]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:23 compute-0 nova_compute[187243]: 2025-12-03 00:23:23.402 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:24 compute-0 nova_compute[187243]: 2025-12-03 00:23:24.638 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:28 compute-0 nova_compute[187243]: 2025-12-03 00:23:28.454 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:29 compute-0 nova_compute[187243]: 2025-12-03 00:23:29.698 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:29 compute-0 podman[197600]: time="2025-12-03T00:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:23:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:23:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:23:31 compute-0 podman[221694]: 2025-12-03 00:23:31.090280815 +0000 UTC m=+0.050333384 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 03 00:23:31 compute-0 podman[221695]: 2025-12-03 00:23:31.094396876 +0000 UTC m=+0.048404667 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: ERROR   00:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: ERROR   00:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: ERROR   00:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: ERROR   00:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: ERROR   00:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:23:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:23:32 compute-0 sshd-session[221692]: Received disconnect from 102.210.148.92 port 35954:11: Bye Bye [preauth]
Dec 03 00:23:32 compute-0 sshd-session[221692]: Disconnected from authenticating user root 102.210.148.92 port 35954 [preauth]
Dec 03 00:23:33 compute-0 nova_compute[187243]: 2025-12-03 00:23:33.457 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:33.691 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:a4:ad 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=705ffc34-85ae-4eb2-b23d-c0cdb18a4c59, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=feb85889-8253-4b4d-b822-af965338aa22) old=Port_Binding(mac=['fa:16:3e:87:a4:ad'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb480f63-2911-490a-aba2-8454934ba6e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:33.692 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port feb85889-8253-4b4d-b822-af965338aa22 in datapath cb480f63-2911-490a-aba2-8454934ba6e8 updated
Dec 03 00:23:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:33.693 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cb480f63-2911-490a-aba2-8454934ba6e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:23:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:33.694 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2687df9b-c7d8-46b7-93d5-b552ab74d62a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:34 compute-0 nova_compute[187243]: 2025-12-03 00:23:34.701 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:35 compute-0 sshd-session[221734]: Connection closed by 101.47.140.127 port 40174 [preauth]
Dec 03 00:23:37 compute-0 ovn_controller[95488]: 2025-12-03T00:23:37Z|00226|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 03 00:23:38 compute-0 nova_compute[187243]: 2025-12-03 00:23:38.459 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:39 compute-0 nova_compute[187243]: 2025-12-03 00:23:39.703 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:41 compute-0 nova_compute[187243]: 2025-12-03 00:23:41.828 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:41 compute-0 nova_compute[187243]: 2025-12-03 00:23:41.828 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:42 compute-0 nova_compute[187243]: 2025-12-03 00:23:42.333 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:23:43 compute-0 podman[221736]: 2025-12-03 00:23:43.103447923 +0000 UTC m=+0.058133475 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:23:43 compute-0 nova_compute[187243]: 2025-12-03 00:23:43.462 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:43 compute-0 nova_compute[187243]: 2025-12-03 00:23:43.702 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:43 compute-0 nova_compute[187243]: 2025-12-03 00:23:43.703 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:43 compute-0 nova_compute[187243]: 2025-12-03 00:23:43.711 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:23:43 compute-0 nova_compute[187243]: 2025-12-03 00:23:43.711 187247 INFO nova.compute.claims [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:23:44 compute-0 nova_compute[187243]: 2025-12-03 00:23:44.745 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:44 compute-0 nova_compute[187243]: 2025-12-03 00:23:44.787 187247 DEBUG nova.compute.provider_tree [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:23:45 compute-0 nova_compute[187243]: 2025-12-03 00:23:45.295 187247 DEBUG nova.scheduler.client.report [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:23:45 compute-0 nova_compute[187243]: 2025-12-03 00:23:45.809 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:45 compute-0 nova_compute[187243]: 2025-12-03 00:23:45.810 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:23:46 compute-0 podman[221762]: 2025-12-03 00:23:46.117070169 +0000 UTC m=+0.074574418 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:23:46 compute-0 podman[221763]: 2025-12-03 00:23:46.140309059 +0000 UTC m=+0.094798534 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:23:46 compute-0 nova_compute[187243]: 2025-12-03 00:23:46.319 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:23:46 compute-0 nova_compute[187243]: 2025-12-03 00:23:46.320 187247 DEBUG nova.network.neutron [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:23:46 compute-0 nova_compute[187243]: 2025-12-03 00:23:46.320 187247 WARNING neutronclient.v2_0.client [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:23:46 compute-0 nova_compute[187243]: 2025-12-03 00:23:46.320 187247 WARNING neutronclient.v2_0.client [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:23:46 compute-0 nova_compute[187243]: 2025-12-03 00:23:46.832 187247 INFO nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:23:47 compute-0 nova_compute[187243]: 2025-12-03 00:23:47.208 187247 DEBUG nova.network.neutron [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Successfully created port: d8c14b2b-88f1-46e9-af74-d11479fced60 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:23:47 compute-0 nova_compute[187243]: 2025-12-03 00:23:47.345 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:23:48 compute-0 sshd-session[221807]: Invalid user exx from 20.123.120.169 port 50090
Dec 03 00:23:48 compute-0 sshd-session[221807]: Received disconnect from 20.123.120.169 port 50090:11: Bye Bye [preauth]
Dec 03 00:23:48 compute-0 sshd-session[221807]: Disconnected from invalid user exx 20.123.120.169 port 50090 [preauth]
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.369 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.370 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.371 187247 INFO nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Creating image(s)
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.371 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.372 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.372 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.373 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.376 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.377 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.448 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.449 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.450 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.451 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.453 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.454 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.464 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.543 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.544 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.588 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.589 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.590 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.646 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.648 187247 DEBUG nova.virt.disk.api [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Checking if we can resize image /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.648 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.704 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.705 187247 DEBUG nova.virt.disk.api [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Cannot resize image /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.705 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.705 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Ensure instance console log exists: /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.706 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.706 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.706 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:48 compute-0 nova_compute[187243]: 2025-12-03 00:23:48.943 187247 DEBUG nova.network.neutron [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Successfully updated port: d8c14b2b-88f1-46e9-af74-d11479fced60 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.030 187247 DEBUG nova.compute.manager [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-changed-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.030 187247 DEBUG nova.compute.manager [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Refreshing instance network info cache due to event network-changed-d8c14b2b-88f1-46e9-af74-d11479fced60. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.031 187247 DEBUG oslo_concurrency.lockutils [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.031 187247 DEBUG oslo_concurrency.lockutils [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.031 187247 DEBUG nova.network.neutron [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Refreshing network info cache for port d8c14b2b-88f1-46e9-af74-d11479fced60 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.450 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.537 187247 WARNING neutronclient.v2_0.client [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:23:49 compute-0 nova_compute[187243]: 2025-12-03 00:23:49.745 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:50 compute-0 nova_compute[187243]: 2025-12-03 00:23:50.054 187247 DEBUG nova.network.neutron [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:23:50 compute-0 nova_compute[187243]: 2025-12-03 00:23:50.214 187247 DEBUG nova.network.neutron [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:23:50 compute-0 nova_compute[187243]: 2025-12-03 00:23:50.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:50 compute-0 nova_compute[187243]: 2025-12-03 00:23:50.722 187247 DEBUG oslo_concurrency.lockutils [req-4348059e-7b38-40d4-b807-9d0707da3f28 req-d185ceb2-f43c-40ae-a60f-20ab752c9713 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:23:50 compute-0 nova_compute[187243]: 2025-12-03 00:23:50.723 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquired lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:23:50 compute-0 nova_compute[187243]: 2025-12-03 00:23:50.723 187247 DEBUG nova.network.neutron [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:23:51 compute-0 nova_compute[187243]: 2025-12-03 00:23:51.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:52 compute-0 nova_compute[187243]: 2025-12-03 00:23:52.047 187247 DEBUG nova.network.neutron [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:23:52 compute-0 nova_compute[187243]: 2025-12-03 00:23:52.261 187247 WARNING neutronclient.v2_0.client [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:23:52 compute-0 nova_compute[187243]: 2025-12-03 00:23:52.486 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:52 compute-0 nova_compute[187243]: 2025-12-03 00:23:52.998 187247 WARNING nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Dec 03 00:23:52 compute-0 nova_compute[187243]: 2025-12-03 00:23:52.999 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Triggering sync for uuid 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.000 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.068 187247 DEBUG nova.network.neutron [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.467 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.576 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Releasing lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.577 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance network_info: |[{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.579 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Start _get_guest_xml network_info=[{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.582 187247 WARNING nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.583 187247 DEBUG nova.virt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-802250903', uuid='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab'), owner=OwnerMeta(userid='43c8524f2d244e8aa3019dd878dcfb81', username='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin', projectid='a8545a5c94f84697a8605fadf08251f7', projectname='tempest-TestExecuteZoneMigrationStrategy-558903593'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": 
"d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721433.5834606) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.588 187247 DEBUG nova.virt.libvirt.host [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.589 187247 DEBUG nova.virt.libvirt.host [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.593 187247 DEBUG nova.virt.libvirt.host [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.593 187247 DEBUG nova.virt.libvirt.host [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.595 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.595 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.595 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.595 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.596 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.596 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.596 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.596 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.596 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.597 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.597 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.597 187247 DEBUG nova.virt.hardware [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.601 187247 DEBUG nova.virt.libvirt.vif [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-802250903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-802250903',id=30,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-s8j6lm3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecut
eZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:23:47Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.601 187247 DEBUG nova.network.os_vif_util [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.602 187247 DEBUG nova.network.os_vif_util [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:23:53 compute-0 nova_compute[187243]: 2025-12-03 00:23:53.603 187247 DEBUG nova.objects.instance [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.111 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <uuid>89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</uuid>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <name>instance-0000001e</name>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-802250903</nova:name>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:23:53</nova:creationTime>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:23:54 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:23:54 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         <nova:port uuid="d8c14b2b-88f1-46e9-af74-d11479fced60">
Dec 03 00:23:54 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <system>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <entry name="serial">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <entry name="uuid">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </system>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <os>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </os>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <features>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </features>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:ec:0e:e1"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <target dev="tapd8c14b2b-88"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <video>
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </video>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:23:54 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:23:54 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:23:54 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:23:54 compute-0 nova_compute[187243]: </domain>
Dec 03 00:23:54 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.112 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Preparing to wait for external event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.112 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.112 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.112 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.113 187247 DEBUG nova.virt.libvirt.vif [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-802250903',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-802250903',id=30,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-s8j6lm3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:23:47Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.113 187247 DEBUG nova.network.os_vif_util [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.114 187247 DEBUG nova.network.os_vif_util [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.114 187247 DEBUG os_vif [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.114 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.115 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.115 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.116 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.116 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '46e46b79-bfc8-5b87-8e55-d131c9e691f8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.119 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.121 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.121 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8c14b2b-88, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.122 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd8c14b2b-88, col_values=(('qos', UUID('83cb48ed-8702-48f5-aaa7-a5ddeba51301')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.122 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd8c14b2b-88, col_values=(('external_ids', {'iface-id': 'd8c14b2b-88f1-46e9-af74-d11479fced60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:0e:e1', 'vm-uuid': '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.123 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:54 compute-0 NetworkManager[55671]: <info>  [1764721434.1239] manager: (tapd8c14b2b-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.125 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.128 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.128 187247 INFO os_vif [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88')
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:23:54 compute-0 nova_compute[187243]: 2025-12-03 00:23:54.748 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.107 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.811 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.811 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.811 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No VIF found with MAC fa:16:3e:ec:0e:e1, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:23:55 compute-0 nova_compute[187243]: 2025-12-03 00:23:55.812 187247 INFO nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Using config drive
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.140 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.191 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.192 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.248 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.249 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000001e, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config'
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.326 187247 WARNING neutronclient.v2_0.client [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.373 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.374 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.391 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.391 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5831MB free_disk=73.16205978393555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.392 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:56 compute-0 nova_compute[187243]: 2025-12-03 00:23:56.392 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:57 compute-0 nova_compute[187243]: 2025-12-03 00:23:57.480 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:23:57 compute-0 nova_compute[187243]: 2025-12-03 00:23:57.480 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:23:57 compute-0 nova_compute[187243]: 2025-12-03 00:23:57.481 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:23:56 up  1:32,  0 user,  load average: 0.07, 0.23, 0.26\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_a8545a5c94f84697a8605fadf08251f7': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:23:57 compute-0 nova_compute[187243]: 2025-12-03 00:23:57.553 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:23:57 compute-0 nova_compute[187243]: 2025-12-03 00:23:57.974 187247 INFO nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Creating config drive at /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config
Dec 03 00:23:57 compute-0 nova_compute[187243]: 2025-12-03 00:23:57.980 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpidq7ub0j execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.104 187247 DEBUG oslo_concurrency.processutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpidq7ub0j" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:23:58 compute-0 kernel: tapd8c14b2b-88: entered promiscuous mode
Dec 03 00:23:58 compute-0 NetworkManager[55671]: <info>  [1764721438.1524] manager: (tapd8c14b2b-88): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Dec 03 00:23:58 compute-0 ovn_controller[95488]: 2025-12-03T00:23:58Z|00227|binding|INFO|Claiming lport d8c14b2b-88f1-46e9-af74-d11479fced60 for this chassis.
Dec 03 00:23:58 compute-0 ovn_controller[95488]: 2025-12-03T00:23:58Z|00228|binding|INFO|d8c14b2b-88f1-46e9-af74-d11479fced60: Claiming fa:16:3e:ec:0e:e1 10.100.0.3
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.154 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 systemd-udevd[221849]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:23:58 compute-0 NetworkManager[55671]: <info>  [1764721438.1891] device (tapd8c14b2b-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:23:58 compute-0 NetworkManager[55671]: <info>  [1764721438.1903] device (tapd8c14b2b-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:23:58 compute-0 systemd-machined[153518]: New machine qemu-21-instance-0000001e.
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.208 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.213 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 ovn_controller[95488]: 2025-12-03T00:23:58Z|00229|binding|INFO|Setting lport d8c14b2b-88f1-46e9-af74-d11479fced60 ovn-installed in OVS
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.217 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001e.
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.321 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:23:58 compute-0 ovn_controller[95488]: 2025-12-03T00:23:58Z|00230|binding|INFO|Setting lport d8c14b2b-88f1-46e9-af74-d11479fced60 up in Southbound
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.585 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:0e:e1 10.100.0.3'], port_security=['fa:16:3e:ec:0e:e1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=d8c14b2b-88f1-46e9-af74-d11479fced60) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.586 104379 INFO neutron.agent.ovn.metadata.agent [-] Port d8c14b2b-88f1-46e9-af74-d11479fced60 in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 bound to our chassis
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.587 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.597 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4760dc60-750b-43cd-a63a-a87bcf33c888]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.598 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7a76663-51 in ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.599 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7a76663-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.600 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5e381114-86cc-456a-b63c-741b3d323863]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.600 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[56272845-e07f-4dc5-b7ce-18958b830ef1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.612 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[9373fe10-9200-4951-882c-9fc32399fe58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.619 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[93dc6922-56ea-4ef6-8045-b0c7530bdd2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.643 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[1f531a72-1bb9-43e4-9f3d-04a9a0379b9d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 NetworkManager[55671]: <info>  [1764721438.6515] manager: (tapf7a76663-50): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.650 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[07fb4d86-df25-4ef5-a9d4-03639ac971e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 systemd-udevd[221853]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.685 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e02b36-0212-4902-861f-3cae69f9ab6c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.688 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[afeafb6e-bdb6-41da-8e08-9da1b8a50cb5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 NetworkManager[55671]: <info>  [1764721438.7134] device (tapf7a76663-50): carrier: link connected
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.720 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc71e6c-e340-49d8-99b5-bb8c9c09103c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.739 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfa925c-4235-41b4-a34d-86eb46f98bc8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552935, 'reachable_time': 44366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221893, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.756 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[70b49738-f03d-4adc-8bf2-3382b8bd1645]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:f9e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552935, 'tstamp': 552935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221894, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.774 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[63db8794-b64e-4a4d-9971-6cfdb65ea347]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552935, 'reachable_time': 44366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221895, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.803 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[495d8598-4180-4aa4-80b6-84a27be2ee19]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.855 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8b120d6a-faab-4df1-8754-88e63bd6c81c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.856 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.856 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.856 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.858 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 kernel: tapf7a76663-50: entered promiscuous mode
Dec 03 00:23:58 compute-0 NetworkManager[55671]: <info>  [1764721438.8589] manager: (tapf7a76663-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.860 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.861 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 ovn_controller[95488]: 2025-12-03T00:23:58Z|00231|binding|INFO|Releasing lport 45446e36-d2c9-4ea6-b9fb-83e2711350dd from this chassis (sb_readonly=0)
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.862 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.863 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7a331487-85fc-497b-aac0-b3d1218f3774]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.864 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.864 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.864 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.864 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.865 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f1343641-e6e0-48b8-9214-85517127bde5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.865 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.866 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0559203a-b3d2-4963-b5a5-21a0c615e4dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.866 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.867 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'env', 'PROCESS_TAG=haproxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.873 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.910 187247 DEBUG nova.compute.manager [req-5ac4474f-d73b-44ce-b1ef-bc967d12e756 req-f6eadc26-9722-4229-a72d-8a1ed12f1849 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.910 187247 DEBUG oslo_concurrency.lockutils [req-5ac4474f-d73b-44ce-b1ef-bc967d12e756 req-f6eadc26-9722-4229-a72d-8a1ed12f1849 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.911 187247 DEBUG oslo_concurrency.lockutils [req-5ac4474f-d73b-44ce-b1ef-bc967d12e756 req-f6eadc26-9722-4229-a72d-8a1ed12f1849 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.911 187247 DEBUG oslo_concurrency.lockutils [req-5ac4474f-d73b-44ce-b1ef-bc967d12e756 req-f6eadc26-9722-4229-a72d-8a1ed12f1849 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.911 187247 DEBUG nova.compute.manager [req-5ac4474f-d73b-44ce-b1ef-bc967d12e756 req-f6eadc26-9722-4229-a72d-8a1ed12f1849 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Processing event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.911 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.917 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.921 187247 INFO nova.virt.libvirt.driver [-] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance spawned successfully.
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.921 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:23:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:58.960 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:23:58 compute-0 nova_compute[187243]: 2025-12-03 00:23:58.961 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.164 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.216 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.217 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.825s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:23:59 compute-0 podman[221927]: 2025-12-03 00:23:59.223930677 +0000 UTC m=+0.048773565 container create f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:23:59 compute-0 systemd[1]: Started libpod-conmon-f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd.scope.
Dec 03 00:23:59 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:23:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/160bdea42f5ee088c35a5f6255f25dc2be590880cdb5ad5967f7d724d5fb9237/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:23:59 compute-0 podman[221927]: 2025-12-03 00:23:59.19913661 +0000 UTC m=+0.023979518 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:23:59 compute-0 podman[221927]: 2025-12-03 00:23:59.30039204 +0000 UTC m=+0.125234978 container init f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:23:59 compute-0 podman[221927]: 2025-12-03 00:23:59.305591057 +0000 UTC m=+0.130433965 container start f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:23:59 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [NOTICE]   (221947) : New worker (221949) forked
Dec 03 00:23:59 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [NOTICE]   (221947) : Loading success.
Dec 03 00:23:59 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:23:59.379 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.440 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.441 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.441 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.442 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.442 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.443 187247 DEBUG nova.virt.libvirt.driver [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:23:59 compute-0 podman[197600]: time="2025-12-03T00:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.750 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:23:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:23:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.992 187247 INFO nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Took 11.62 seconds to spawn the instance on the hypervisor.
Dec 03 00:23:59 compute-0 nova_compute[187243]: 2025-12-03 00:23:59.993 187247 DEBUG nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.525 187247 INFO nova.compute.manager [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Took 16.87 seconds to build instance.
Dec 03 00:24:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:00.726 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:00.726 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:00.727 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.991 187247 DEBUG nova.compute.manager [req-7a517069-b69c-4864-a7b7-8c56b795e282 req-9c4fe83c-29fa-41cd-9f3d-e34492e30d98 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.992 187247 DEBUG oslo_concurrency.lockutils [req-7a517069-b69c-4864-a7b7-8c56b795e282 req-9c4fe83c-29fa-41cd-9f3d-e34492e30d98 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.992 187247 DEBUG oslo_concurrency.lockutils [req-7a517069-b69c-4864-a7b7-8c56b795e282 req-9c4fe83c-29fa-41cd-9f3d-e34492e30d98 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.992 187247 DEBUG oslo_concurrency.lockutils [req-7a517069-b69c-4864-a7b7-8c56b795e282 req-9c4fe83c-29fa-41cd-9f3d-e34492e30d98 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.992 187247 DEBUG nova.compute.manager [req-7a517069-b69c-4864-a7b7-8c56b795e282 req-9c4fe83c-29fa-41cd-9f3d-e34492e30d98 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:00 compute-0 nova_compute[187243]: 2025-12-03 00:24:00.993 187247 WARNING nova.compute.manager [req-7a517069-b69c-4864-a7b7-8c56b795e282 req-9c4fe83c-29fa-41cd-9f3d-e34492e30d98 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received unexpected event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with vm_state active and task_state None.
Dec 03 00:24:01 compute-0 nova_compute[187243]: 2025-12-03 00:24:01.034 187247 DEBUG oslo_concurrency.lockutils [None req-e5ded702-8981-480f-b227-9adfe03547f3 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.206s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:01 compute-0 nova_compute[187243]: 2025-12-03 00:24:01.035 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 8.036s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:01 compute-0 nova_compute[187243]: 2025-12-03 00:24:01.035 187247 INFO nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 03 00:24:01 compute-0 nova_compute[187243]: 2025-12-03 00:24:01.036 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: ERROR   00:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: ERROR   00:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: ERROR   00:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: ERROR   00:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: ERROR   00:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:24:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:24:02 compute-0 podman[221960]: 2025-12-03 00:24:02.098665352 +0000 UTC m=+0.057969061 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Dec 03 00:24:02 compute-0 podman[221961]: 2025-12-03 00:24:02.106147525 +0000 UTC m=+0.062218015 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:24:02 compute-0 nova_compute[187243]: 2025-12-03 00:24:02.217 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:02 compute-0 nova_compute[187243]: 2025-12-03 00:24:02.217 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:02 compute-0 nova_compute[187243]: 2025-12-03 00:24:02.217 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:02 compute-0 nova_compute[187243]: 2025-12-03 00:24:02.217 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:02 compute-0 nova_compute[187243]: 2025-12-03 00:24:02.218 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:24:04 compute-0 nova_compute[187243]: 2025-12-03 00:24:04.167 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:04 compute-0 nova_compute[187243]: 2025-12-03 00:24:04.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:04 compute-0 nova_compute[187243]: 2025-12-03 00:24:04.753 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:07 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:07.381 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:09 compute-0 nova_compute[187243]: 2025-12-03 00:24:09.170 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:09 compute-0 nova_compute[187243]: 2025-12-03 00:24:09.755 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:12 compute-0 ovn_controller[95488]: 2025-12-03T00:24:12Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ec:0e:e1 10.100.0.3
Dec 03 00:24:12 compute-0 ovn_controller[95488]: 2025-12-03T00:24:12Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ec:0e:e1 10.100.0.3
Dec 03 00:24:14 compute-0 podman[222011]: 2025-12-03 00:24:14.111436708 +0000 UTC m=+0.060071512 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:24:14 compute-0 nova_compute[187243]: 2025-12-03 00:24:14.172 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:14 compute-0 nova_compute[187243]: 2025-12-03 00:24:14.797 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:17 compute-0 podman[222035]: 2025-12-03 00:24:17.100288489 +0000 UTC m=+0.060659047 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 03 00:24:17 compute-0 podman[222036]: 2025-12-03 00:24:17.134526847 +0000 UTC m=+0.089488452 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:24:19 compute-0 nova_compute[187243]: 2025-12-03 00:24:19.191 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:19 compute-0 nova_compute[187243]: 2025-12-03 00:24:19.799 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:23 compute-0 sshd-session[222078]: Received disconnect from 61.220.235.10 port 52300:11: Bye Bye [preauth]
Dec 03 00:24:23 compute-0 sshd-session[222078]: Disconnected from authenticating user root 61.220.235.10 port 52300 [preauth]
Dec 03 00:24:24 compute-0 nova_compute[187243]: 2025-12-03 00:24:24.193 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:24 compute-0 nova_compute[187243]: 2025-12-03 00:24:24.801 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:29 compute-0 nova_compute[187243]: 2025-12-03 00:24:29.194 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:29 compute-0 podman[197600]: time="2025-12-03T00:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:24:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:24:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3071 "" "Go-http-client/1.1"
Dec 03 00:24:29 compute-0 nova_compute[187243]: 2025-12-03 00:24:29.857 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: ERROR   00:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: ERROR   00:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: ERROR   00:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: ERROR   00:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: ERROR   00:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:24:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:24:33 compute-0 podman[222081]: 2025-12-03 00:24:33.105340468 +0000 UTC m=+0.056271940 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git)
Dec 03 00:24:33 compute-0 podman[222080]: 2025-12-03 00:24:33.13972151 +0000 UTC m=+0.092977549 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:24:34 compute-0 nova_compute[187243]: 2025-12-03 00:24:34.197 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:34 compute-0 nova_compute[187243]: 2025-12-03 00:24:34.924 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:35 compute-0 nova_compute[187243]: 2025-12-03 00:24:35.635 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Check if temp file /var/lib/nova/instances/tmpve2lxt8_ exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:24:35 compute-0 nova_compute[187243]: 2025-12-03 00:24:35.640 187247 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:24:39 compute-0 nova_compute[187243]: 2025-12-03 00:24:39.199 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:39 compute-0 nova_compute[187243]: 2025-12-03 00:24:39.927 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.487 187247 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.540 187247 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.542 187247 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.624 187247 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.625 187247 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Preparing to wait for external event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.626 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.626 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:40 compute-0 nova_compute[187243]: 2025-12-03 00:24:40.626 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:44 compute-0 nova_compute[187243]: 2025-12-03 00:24:44.201 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:44 compute-0 nova_compute[187243]: 2025-12-03 00:24:44.961 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:45 compute-0 podman[222128]: 2025-12-03 00:24:45.129137085 +0000 UTC m=+0.078792931 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 03 00:24:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:47.999 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:24:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:48.000 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.001 187247 DEBUG nova.compute.manager [req-70f4244d-292a-41b8-a37d-8d2bd43072bb req-8e34d13c-f697-440f-9aa4-43194177e02a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.001 187247 DEBUG oslo_concurrency.lockutils [req-70f4244d-292a-41b8-a37d-8d2bd43072bb req-8e34d13c-f697-440f-9aa4-43194177e02a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.002 187247 DEBUG oslo_concurrency.lockutils [req-70f4244d-292a-41b8-a37d-8d2bd43072bb req-8e34d13c-f697-440f-9aa4-43194177e02a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.002 187247 DEBUG oslo_concurrency.lockutils [req-70f4244d-292a-41b8-a37d-8d2bd43072bb req-8e34d13c-f697-440f-9aa4-43194177e02a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.002 187247 DEBUG nova.compute.manager [req-70f4244d-292a-41b8-a37d-8d2bd43072bb req-8e34d13c-f697-440f-9aa4-43194177e02a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No event matching network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 in dict_keys([('network-vif-plugged', 'd8c14b2b-88f1-46e9-af74-d11479fced60')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.003 187247 DEBUG nova.compute.manager [req-70f4244d-292a-41b8-a37d-8d2bd43072bb req-8e34d13c-f697-440f-9aa4-43194177e02a 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:24:48 compute-0 nova_compute[187243]: 2025-12-03 00:24:48.003 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:48 compute-0 podman[222154]: 2025-12-03 00:24:48.098837587 +0000 UTC m=+0.058074324 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 03 00:24:48 compute-0 podman[222155]: 2025-12-03 00:24:48.123624334 +0000 UTC m=+0.084647995 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Dec 03 00:24:49 compute-0 nova_compute[187243]: 2025-12-03 00:24:49.204 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:49 compute-0 nova_compute[187243]: 2025-12-03 00:24:49.655 187247 INFO nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Took 9.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.003 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.071 187247 DEBUG nova.compute.manager [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.072 187247 DEBUG oslo_concurrency.lockutils [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.072 187247 DEBUG oslo_concurrency.lockutils [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.072 187247 DEBUG oslo_concurrency.lockutils [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.072 187247 DEBUG nova.compute.manager [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Processing event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.072 187247 DEBUG nova.compute.manager [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-changed-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.072 187247 DEBUG nova.compute.manager [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Refreshing instance network info cache due to event network-changed-d8c14b2b-88f1-46e9-af74-d11479fced60. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.073 187247 DEBUG oslo_concurrency.lockutils [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.073 187247 DEBUG oslo_concurrency.lockutils [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.073 187247 DEBUG nova.network.neutron [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Refreshing network info cache for port d8c14b2b-88f1-46e9-af74-d11479fced60 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.074 187247 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:24:50 compute-0 ovn_controller[95488]: 2025-12-03T00:24:50Z|00232|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.580 187247 WARNING neutronclient.v2_0.client [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:50 compute-0 nova_compute[187243]: 2025-12-03 00:24:50.586 187247 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpve2lxt8_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89b22e0d-2f57-40f3-8c02-38af8f0ac9ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(de51b730-21c6-4368-a536-eef3863ee14c),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:24:51 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:51.002 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.098 187247 DEBUG nova.objects.instance [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.099 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.100 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.100 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.452 187247 WARNING neutronclient.v2_0.client [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.603 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.603 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.609 187247 DEBUG nova.virt.libvirt.vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-802250903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-802250903',id=30,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:23:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-s8j6lm3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:24:00Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.610 187247 DEBUG nova.network.os_vif_util [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.610 187247 DEBUG nova.network.os_vif_util [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.611 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:ec:0e:e1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <target dev="tapd8c14b2b-88"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]: </interface>
Dec 03 00:24:51 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.611 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <name>instance-0000001e</name>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <uuid>89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</uuid>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-802250903</nova:name>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:23:53</nova:creationTime>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:port uuid="d8c14b2b-88f1-46e9-af74-d11479fced60">
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <system>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="serial">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="uuid">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </system>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <os>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </os>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <features>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </features>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:ec:0e:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd8c14b2b-88"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </target>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </console>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </input>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <video>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </video>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]: </domain>
Dec 03 00:24:51 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.613 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <name>instance-0000001e</name>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <uuid>89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</uuid>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-802250903</nova:name>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:23:53</nova:creationTime>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:port uuid="d8c14b2b-88f1-46e9-af74-d11479fced60">
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <system>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="serial">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="uuid">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </system>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <os>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </os>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <features>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </features>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:ec:0e:e1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="tapd8c14b2b-88"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </target>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </console>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </input>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <video>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </video>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]: </domain>
Dec 03 00:24:51 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.615 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <name>instance-0000001e</name>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <uuid>89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</uuid>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-802250903</nova:name>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:23:53</nova:creationTime>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <nova:port uuid="d8c14b2b-88f1-46e9-af74-d11479fced60">
Dec 03 00:24:51 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <system>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="serial">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="uuid">89b22e0d-2f57-40f3-8c02-38af8f0ac9ab</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </system>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <os>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </os>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <features>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </features>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/disk.config"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <interface type="ethernet"><mac address="fa:16:3e:ec:0e:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd8c14b2b-88"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </interface><serial type="pty">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:24:51 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       </target>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab/console.log" append="off"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </console>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </input>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <video>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </video>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:24:51 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:24:51 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:24:51 compute-0 nova_compute[187243]: </domain>
Dec 03 00:24:51 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.616 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.657 187247 DEBUG nova.network.neutron [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updated VIF entry in instance network info cache for port d8c14b2b-88f1-46e9-af74-d11479fced60. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:24:51 compute-0 nova_compute[187243]: 2025-12-03 00:24:51.657 187247 DEBUG nova.network.neutron [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating instance_info_cache with network_info: [{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:24:52 compute-0 nova_compute[187243]: 2025-12-03 00:24:52.106 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:24:52 compute-0 nova_compute[187243]: 2025-12-03 00:24:52.106 187247 INFO nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:24:52 compute-0 nova_compute[187243]: 2025-12-03 00:24:52.164 187247 DEBUG oslo_concurrency.lockutils [req-f5c03f60-7abb-44ac-945c-281fe3d23ad2 req-1ffe1c66-f791-42d8-844a-7078964bca40 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-89b22e0d-2f57-40f3-8c02-38af8f0ac9ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:24:52 compute-0 nova_compute[187243]: 2025-12-03 00:24:52.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:53 compute-0 nova_compute[187243]: 2025-12-03 00:24:53.129 187247 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:24:53 compute-0 nova_compute[187243]: 2025-12-03 00:24:53.644 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:24:53 compute-0 nova_compute[187243]: 2025-12-03 00:24:53.645 187247 DEBUG nova.virt.libvirt.migration [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:24:53 compute-0 sshd-session[222206]: Invalid user casaos from 23.95.37.90 port 58718
Dec 03 00:24:53 compute-0 sshd-session[222206]: Received disconnect from 23.95.37.90 port 58718:11: Bye Bye [preauth]
Dec 03 00:24:53 compute-0 sshd-session[222206]: Disconnected from invalid user casaos 23.95.37.90 port 58718 [preauth]
Dec 03 00:24:53 compute-0 kernel: tapd8c14b2b-88 (unregistering): left promiscuous mode
Dec 03 00:24:53 compute-0 NetworkManager[55671]: <info>  [1764721493.7902] device (tapd8c14b2b-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:24:53 compute-0 nova_compute[187243]: 2025-12-03 00:24:53.798 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:53 compute-0 ovn_controller[95488]: 2025-12-03T00:24:53Z|00233|binding|INFO|Releasing lport d8c14b2b-88f1-46e9-af74-d11479fced60 from this chassis (sb_readonly=0)
Dec 03 00:24:53 compute-0 ovn_controller[95488]: 2025-12-03T00:24:53Z|00234|binding|INFO|Setting lport d8c14b2b-88f1-46e9-af74-d11479fced60 down in Southbound
Dec 03 00:24:53 compute-0 ovn_controller[95488]: 2025-12-03T00:24:53Z|00235|binding|INFO|Removing iface tapd8c14b2b-88 ovn-installed in OVS
Dec 03 00:24:53 compute-0 nova_compute[187243]: 2025-12-03 00:24:53.800 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:53.808 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:0e:e1 10.100.0.3'], port_security=['fa:16:3e:ec:0e:e1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '10', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=d8c14b2b-88f1-46e9-af74-d11479fced60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:24:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:53.809 104379 INFO neutron.agent.ovn.metadata.agent [-] Port d8c14b2b-88f1-46e9-af74-d11479fced60 in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:24:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:53.811 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:24:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:53.812 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9bf18c-6e8a-4651-bfc2-64763d83b42d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:53 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:53.813 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace which is not needed anymore
Dec 03 00:24:53 compute-0 nova_compute[187243]: 2025-12-03 00:24:53.822 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:53 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Dec 03 00:24:53 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001e.scope: Consumed 14.733s CPU time.
Dec 03 00:24:53 compute-0 systemd-machined[153518]: Machine qemu-21-instance-0000001e terminated.
Dec 03 00:24:53 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [NOTICE]   (221947) : haproxy version is 3.0.5-8e879a5
Dec 03 00:24:53 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [NOTICE]   (221947) : path to executable is /usr/sbin/haproxy
Dec 03 00:24:53 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [WARNING]  (221947) : Exiting Master process...
Dec 03 00:24:53 compute-0 podman[222234]: 2025-12-03 00:24:53.930850428 +0000 UTC m=+0.028790516 container kill f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:24:53 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [ALERT]    (221947) : Current worker (221949) exited with code 143 (Terminated)
Dec 03 00:24:53 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[221943]: [WARNING]  (221947) : All workers exited. Exiting... (0)
Dec 03 00:24:53 compute-0 systemd[1]: libpod-f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd.scope: Deactivated successfully.
Dec 03 00:24:53 compute-0 podman[222250]: 2025-12-03 00:24:53.97542309 +0000 UTC m=+0.026060989 container died f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 03 00:24:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd-userdata-shm.mount: Deactivated successfully.
Dec 03 00:24:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-160bdea42f5ee088c35a5f6255f25dc2be590880cdb5ad5967f7d724d5fb9237-merged.mount: Deactivated successfully.
Dec 03 00:24:54 compute-0 podman[222250]: 2025-12-03 00:24:54.009026634 +0000 UTC m=+0.059664523 container cleanup f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Dec 03 00:24:54 compute-0 systemd[1]: libpod-conmon-f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd.scope: Deactivated successfully.
Dec 03 00:24:54 compute-0 podman[222252]: 2025-12-03 00:24:54.027195889 +0000 UTC m=+0.071811461 container remove f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.027 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.027 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.027 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.031 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d8daf9ed-228e-4202-9898-b5692509c8b5]: (4, ("Wed Dec  3 12:24:53 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd)\nf42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd\nWed Dec  3 12:24:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (f42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd)\nf42e69c5430e6f7a0d876101b51d1e9d50f21ee3dcb416e33167a52046a67abd\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.033 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe00be0-f507-4b45-b0be-3405a9be695f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.033 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.034 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3626298e-b9b9-4418-acef-804edeb9874a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.034 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.037 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 kernel: tapf7a76663-50: left promiscuous mode
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.055 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.057 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1f61dc30-3716-4630-b450-a9a3eeb8331a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.078 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[70e30078-9b17-45d4-aebd-4ddfd69cb572]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.079 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9e3f7a-0772-456d-8c0c-1ed87948adf8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.095 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9f98a0-c9fd-49c9-b75a-c1dcb93fe5cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552928, 'reachable_time': 26847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222299, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.097 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:24:54 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:24:54.097 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[4c13ad0e-9ca2-475f-8c52-bb3f7024fbde]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:24:54 compute-0 systemd[1]: run-netns-ovnmeta\x2df7a76663\x2d52a3\x2d4e8c\x2daf8a\x2d8ef26c8fecf2.mount: Deactivated successfully.
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.147 187247 DEBUG nova.virt.libvirt.guest [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '89b22e0d-2f57-40f3-8c02-38af8f0ac9ab' (instance-0000001e) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.148 187247 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migration operation has completed
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.148 187247 INFO nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] _post_live_migration() is started..
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.162 187247 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.162 187247 WARNING neutronclient.v2_0.client [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.189 187247 DEBUG nova.compute.manager [req-2c535626-9ad4-4946-9ca4-2405eb204046 req-d9f2120f-5e87-4825-8306-210b9ef31284 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.190 187247 DEBUG oslo_concurrency.lockutils [req-2c535626-9ad4-4946-9ca4-2405eb204046 req-d9f2120f-5e87-4825-8306-210b9ef31284 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.190 187247 DEBUG oslo_concurrency.lockutils [req-2c535626-9ad4-4946-9ca4-2405eb204046 req-d9f2120f-5e87-4825-8306-210b9ef31284 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.190 187247 DEBUG oslo_concurrency.lockutils [req-2c535626-9ad4-4946-9ca4-2405eb204046 req-d9f2120f-5e87-4825-8306-210b9ef31284 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.190 187247 DEBUG nova.compute.manager [req-2c535626-9ad4-4946-9ca4-2405eb204046 req-d9f2120f-5e87-4825-8306-210b9ef31284 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.191 187247 DEBUG nova.compute.manager [req-2c535626-9ad4-4946-9ca4-2405eb204046 req-d9f2120f-5e87-4825-8306-210b9ef31284 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.207 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.621 187247 DEBUG nova.network.neutron [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port d8c14b2b-88f1-46e9-af74-d11479fced60 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.621 187247 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.622 187247 DEBUG nova.virt.libvirt.vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:23:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-802250903',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-802250903',id=30,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:23:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-s8j6lm3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:24:30Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=89b22e0d-2f57-40f3-8c02-38af8f0ac9ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.623 187247 DEBUG nova.network.os_vif_util [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "d8c14b2b-88f1-46e9-af74-d11479fced60", "address": "fa:16:3e:ec:0e:e1", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8c14b2b-88", "ovs_interfaceid": "d8c14b2b-88f1-46e9-af74-d11479fced60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.623 187247 DEBUG nova.network.os_vif_util [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.623 187247 DEBUG os_vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.625 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.625 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8c14b2b-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.627 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.628 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.629 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.629 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=83cb48ed-8702-48f5-aaa7-a5ddeba51301) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.630 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.631 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.633 187247 INFO os_vif [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:0e:e1,bridge_name='br-int',has_traffic_filtering=True,id=d8c14b2b-88f1-46e9-af74-d11479fced60,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8c14b2b-88')
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.633 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.634 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.634 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.634 187247 DEBUG nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.634 187247 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Deleting instance files /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab_del
Dec 03 00:24:54 compute-0 nova_compute[187243]: 2025-12-03 00:24:54.635 187247 INFO nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Deletion of /var/lib/nova/instances/89b22e0d-2f57-40f3-8c02-38af8f0ac9ab_del complete
Dec 03 00:24:55 compute-0 nova_compute[187243]: 2025-12-03 00:24:55.007 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.252 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.252 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.252 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.252 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.253 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.253 187247 WARNING nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received unexpected event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with vm_state active and task_state migrating.
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.253 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.253 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.253 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.253 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.254 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.254 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.254 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.254 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.254 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.254 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.255 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.255 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-unplugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.255 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.255 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.255 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 WARNING nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received unexpected event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with vm_state active and task_state migrating.
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.256 187247 DEBUG oslo_concurrency.lockutils [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.257 187247 DEBUG nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] No waiting events found dispatching network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.257 187247 WARNING nova.compute.manager [req-bd2d13a4-2a0e-4c27-9a61-56e99fe8909f req-0e6a88c0-4b3f-4b2d-b604-f45351d31164 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Received unexpected event network-vif-plugged-d8c14b2b-88f1-46e9-af74-d11479fced60 for instance with vm_state active and task_state migrating.
Dec 03 00:24:56 compute-0 nova_compute[187243]: 2025-12-03 00:24:56.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.109 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.256 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.257 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.280 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.281 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5816MB free_disk=73.16226577758789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.281 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:24:57 compute-0 nova_compute[187243]: 2025-12-03 00:24:57.281 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:24:58 compute-0 nova_compute[187243]: 2025-12-03 00:24:58.312 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Updating resource usage from migration de51b730-21c6-4368-a536-eef3863ee14c
Dec 03 00:24:58 compute-0 nova_compute[187243]: 2025-12-03 00:24:58.379 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Migration de51b730-21c6-4368-a536-eef3863ee14c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:24:58 compute-0 nova_compute[187243]: 2025-12-03 00:24:58.379 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:24:58 compute-0 nova_compute[187243]: 2025-12-03 00:24:58.379 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:24:57 up  1:33,  0 user,  load average: 0.23, 0.27, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_a8545a5c94f84697a8605fadf08251f7': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:24:58 compute-0 nova_compute[187243]: 2025-12-03 00:24:58.425 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:24:58 compute-0 nova_compute[187243]: 2025-12-03 00:24:58.932 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:24:59 compute-0 nova_compute[187243]: 2025-12-03 00:24:59.440 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:24:59 compute-0 nova_compute[187243]: 2025-12-03 00:24:59.440 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.159s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:24:59 compute-0 nova_compute[187243]: 2025-12-03 00:24:59.631 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:24:59 compute-0 podman[197600]: time="2025-12-03T00:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:24:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:24:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec 03 00:25:00 compute-0 nova_compute[187243]: 2025-12-03 00:25:00.008 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:00.727 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:00.728 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:00.728 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: ERROR   00:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: ERROR   00:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: ERROR   00:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: ERROR   00:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: ERROR   00:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:25:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:25:01 compute-0 nova_compute[187243]: 2025-12-03 00:25:01.440 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:01 compute-0 nova_compute[187243]: 2025-12-03 00:25:01.440 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:01 compute-0 nova_compute[187243]: 2025-12-03 00:25:01.440 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:25:01 compute-0 nova_compute[187243]: 2025-12-03 00:25:01.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:01 compute-0 nova_compute[187243]: 2025-12-03 00:25:01.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:03 compute-0 nova_compute[187243]: 2025-12-03 00:25:03.675 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:03 compute-0 nova_compute[187243]: 2025-12-03 00:25:03.675 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:03 compute-0 nova_compute[187243]: 2025-12-03 00:25:03.675 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "89b22e0d-2f57-40f3-8c02-38af8f0ac9ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:04 compute-0 podman[222303]: 2025-12-03 00:25:04.101389485 +0000 UTC m=+0.056403123 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible)
Dec 03 00:25:04 compute-0 podman[222304]: 2025-12-03 00:25:04.108168641 +0000 UTC m=+0.057295485 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible)
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.187 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.187 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.188 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.188 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.320 187247 WARNING nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.321 187247 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.338 187247 DEBUG oslo_concurrency.processutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.339 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5828MB free_disk=73.16229629516602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.339 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.339 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:04 compute-0 nova_compute[187243]: 2025-12-03 00:25:04.633 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.037 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.357 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.867 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.896 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration de51b730-21c6-4368-a536-eef3863ee14c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.896 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.897 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:25:04 up  1:33,  0 user,  load average: 0.21, 0.26, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:25:05 compute-0 nova_compute[187243]: 2025-12-03 00:25:05.946 187247 DEBUG nova.compute.provider_tree [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:25:06 compute-0 nova_compute[187243]: 2025-12-03 00:25:06.453 187247 DEBUG nova.scheduler.client.report [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:25:06 compute-0 nova_compute[187243]: 2025-12-03 00:25:06.967 187247 DEBUG nova.compute.resource_tracker [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:25:06 compute-0 nova_compute[187243]: 2025-12-03 00:25:06.968 187247 DEBUG oslo_concurrency.lockutils [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.628s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:06 compute-0 nova_compute[187243]: 2025-12-03 00:25:06.990 187247 INFO nova.compute.manager [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:25:08 compute-0 nova_compute[187243]: 2025-12-03 00:25:08.122 187247 INFO nova.scheduler.client.report [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration de51b730-21c6-4368-a536-eef3863ee14c
Dec 03 00:25:08 compute-0 nova_compute[187243]: 2025-12-03 00:25:08.123 187247 DEBUG nova.virt.libvirt.driver [None req-3c0b6112-6246-4e7b-804c-7a70e2460000 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 89b22e0d-2f57-40f3-8c02-38af8f0ac9ab] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:25:09 compute-0 nova_compute[187243]: 2025-12-03 00:25:09.635 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:10 compute-0 nova_compute[187243]: 2025-12-03 00:25:10.041 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:14 compute-0 nova_compute[187243]: 2025-12-03 00:25:14.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:14 compute-0 nova_compute[187243]: 2025-12-03 00:25:14.679 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-0 nova_compute[187243]: 2025-12-03 00:25:15.042 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:15 compute-0 sshd-session[222347]: Invalid user syncthing from 101.47.140.127 port 35002
Dec 03 00:25:15 compute-0 podman[222349]: 2025-12-03 00:25:15.450496966 +0000 UTC m=+0.045126246 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:25:15 compute-0 sshd-session[222347]: Received disconnect from 101.47.140.127 port 35002:11: Bye Bye [preauth]
Dec 03 00:25:15 compute-0 sshd-session[222347]: Disconnected from invalid user syncthing 101.47.140.127 port 35002 [preauth]
Dec 03 00:25:19 compute-0 podman[222373]: 2025-12-03 00:25:19.092372462 +0000 UTC m=+0.054469807 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 03 00:25:19 compute-0 podman[222374]: 2025-12-03 00:25:19.121463141 +0000 UTC m=+0.079674610 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:25:19 compute-0 nova_compute[187243]: 2025-12-03 00:25:19.681 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:20 compute-0 sshd-session[222417]: Invalid user admin1 from 20.123.120.169 port 44432
Dec 03 00:25:20 compute-0 nova_compute[187243]: 2025-12-03 00:25:20.043 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:20 compute-0 sshd-session[222417]: Received disconnect from 20.123.120.169 port 44432:11: Bye Bye [preauth]
Dec 03 00:25:20 compute-0 sshd-session[222417]: Disconnected from invalid user admin1 20.123.120.169 port 44432 [preauth]
Dec 03 00:25:24 compute-0 nova_compute[187243]: 2025-12-03 00:25:24.683 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:24 compute-0 nova_compute[187243]: 2025-12-03 00:25:24.975 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:24 compute-0 nova_compute[187243]: 2025-12-03 00:25:24.975 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:25 compute-0 nova_compute[187243]: 2025-12-03 00:25:25.045 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:25 compute-0 nova_compute[187243]: 2025-12-03 00:25:25.484 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 03 00:25:26 compute-0 nova_compute[187243]: 2025-12-03 00:25:26.031 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:26 compute-0 nova_compute[187243]: 2025-12-03 00:25:26.031 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:26 compute-0 nova_compute[187243]: 2025-12-03 00:25:26.037 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 03 00:25:26 compute-0 nova_compute[187243]: 2025-12-03 00:25:26.037 187247 INFO nova.compute.claims [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Claim successful on node compute-0.ctlplane.example.com
Dec 03 00:25:27 compute-0 nova_compute[187243]: 2025-12-03 00:25:27.082 187247 DEBUG nova.compute.provider_tree [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:25:27 compute-0 nova_compute[187243]: 2025-12-03 00:25:27.589 187247 DEBUG nova.scheduler.client.report [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:25:28 compute-0 nova_compute[187243]: 2025-12-03 00:25:28.101 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:28 compute-0 nova_compute[187243]: 2025-12-03 00:25:28.102 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 03 00:25:28 compute-0 nova_compute[187243]: 2025-12-03 00:25:28.612 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 03 00:25:28 compute-0 nova_compute[187243]: 2025-12-03 00:25:28.612 187247 DEBUG nova.network.neutron [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 03 00:25:28 compute-0 nova_compute[187243]: 2025-12-03 00:25:28.612 187247 WARNING neutronclient.v2_0.client [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:28 compute-0 nova_compute[187243]: 2025-12-03 00:25:28.613 187247 WARNING neutronclient.v2_0.client [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:29 compute-0 nova_compute[187243]: 2025-12-03 00:25:29.134 187247 INFO nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 00:25:29 compute-0 sshd-session[222419]: Invalid user max from 45.78.219.213 port 45544
Dec 03 00:25:29 compute-0 nova_compute[187243]: 2025-12-03 00:25:29.299 187247 DEBUG nova.network.neutron [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Successfully created port: 697e2ff1-393b-4c81-abc1-b7afc93f0e5b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 03 00:25:29 compute-0 sshd-session[222419]: Received disconnect from 45.78.219.213 port 45544:11: Bye Bye [preauth]
Dec 03 00:25:29 compute-0 sshd-session[222419]: Disconnected from invalid user max 45.78.219.213 port 45544 [preauth]
Dec 03 00:25:29 compute-0 nova_compute[187243]: 2025-12-03 00:25:29.651 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 03 00:25:29 compute-0 nova_compute[187243]: 2025-12-03 00:25:29.685 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:29 compute-0 podman[197600]: time="2025-12-03T00:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:25:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:25:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2607 "" "Go-http-client/1.1"
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.046 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.254 187247 DEBUG nova.network.neutron [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Successfully updated port: 697e2ff1-393b-4c81-abc1-b7afc93f0e5b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.313 187247 DEBUG nova.compute.manager [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-changed-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.313 187247 DEBUG nova.compute.manager [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Refreshing instance network info cache due to event network-changed-697e2ff1-393b-4c81-abc1-b7afc93f0e5b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.314 187247 DEBUG oslo_concurrency.lockutils [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.314 187247 DEBUG oslo_concurrency.lockutils [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.314 187247 DEBUG nova.network.neutron [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Refreshing network info cache for port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.669 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.670 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.671 187247 INFO nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Creating image(s)
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.671 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.671 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.672 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.672 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.675 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.677 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.760 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.761 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.762 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.762 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.763 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.766 187247 DEBUG oslo_utils.imageutils.format_inspector [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.766 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.821 187247 WARNING neutronclient.v2_0.client [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.827 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.828 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.869 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0,backing_fmt=raw /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.870 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "4b320fb713d74cf7cc71d0105cd653f0a14ecaa0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.870 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.929 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4b320fb713d74cf7cc71d0105cd653f0a14ecaa0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.930 187247 DEBUG nova.virt.disk.api [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Checking if we can resize image /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.930 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.983 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.984 187247 DEBUG nova.virt.disk.api [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Cannot resize image /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.984 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.984 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Ensure instance console log exists: /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.985 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.985 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:30 compute-0 nova_compute[187243]: 2025-12-03 00:25:30.985 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:31 compute-0 nova_compute[187243]: 2025-12-03 00:25:31.088 187247 DEBUG nova.network.neutron [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: ERROR   00:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: ERROR   00:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: ERROR   00:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: ERROR   00:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: ERROR   00:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:25:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:25:33 compute-0 nova_compute[187243]: 2025-12-03 00:25:33.220 187247 DEBUG nova.network.neutron [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:33 compute-0 sshd-session[222421]: Connection closed by 45.78.219.95 port 47044 [preauth]
Dec 03 00:25:33 compute-0 nova_compute[187243]: 2025-12-03 00:25:33.726 187247 DEBUG oslo_concurrency.lockutils [req-bb27ca6c-4403-4bb6-87c3-641411e70c25 req-964169e9-5d4c-4f93-8715-0c82d7140579 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:25:33 compute-0 nova_compute[187243]: 2025-12-03 00:25:33.727 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquired lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:25:33 compute-0 nova_compute[187243]: 2025-12-03 00:25:33.727 187247 DEBUG nova.network.neutron [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 03 00:25:34 compute-0 nova_compute[187243]: 2025-12-03 00:25:34.345 187247 DEBUG nova.network.neutron [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 03 00:25:34 compute-0 nova_compute[187243]: 2025-12-03 00:25:34.537 187247 WARNING neutronclient.v2_0.client [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:34 compute-0 nova_compute[187243]: 2025-12-03 00:25:34.687 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:34 compute-0 nova_compute[187243]: 2025-12-03 00:25:34.692 187247 DEBUG nova.network.neutron [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.092 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 podman[222438]: 2025-12-03 00:25:35.114583066 +0000 UTC m=+0.064849673 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Dec 03 00:25:35 compute-0 podman[222439]: 2025-12-03 00:25:35.121052036 +0000 UTC m=+0.063855889 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.199 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Releasing lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.199 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance network_info: |[{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.201 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Start _get_guest_xml network_info=[{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'size': 0, 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '92e79321-71af-44a0-869c-1d5a9da5fefc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.204 187247 WARNING nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.205 187247 DEBUG nova.virt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='92e79321-71af-44a0-869c-1d5a9da5fefc', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384', uuid='20d06540-44a6-4c4c-ab2f-d4997af86fa0'), owner=OwnerMeta(userid='43c8524f2d244e8aa3019dd878dcfb81', username='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin', projectid='a8545a5c94f84697a8605fadf08251f7', projectname='tempest-TestExecuteZoneMigrationStrategy-558903593'), image=ImageMeta(id='92e79321-71af-44a0-869c-1d5a9da5fefc', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764721535.2054381) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.211 187247 DEBUG nova.virt.libvirt.host [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.211 187247 DEBUG nova.virt.libvirt.host [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.214 187247 DEBUG nova.virt.libvirt.host [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.214 187247 DEBUG nova.virt.libvirt.host [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.215 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.216 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-02T23:47:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b2669e62-ef04-4b34-b3d6-69efcfbafbdc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-02T23:47:52Z,direct_url=<?>,disk_format='qcow2',id=92e79321-71af-44a0-869c-1d5a9da5fefc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='22106c97f2524355a0bbadb98eaf5c22',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-02T23:47:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.216 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.216 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.216 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.217 187247 DEBUG nova.virt.hardware [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.221 187247 DEBUG nova.virt.libvirt.vif [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1695178384',id=32,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-sxcn790n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:25:29Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.221 187247 DEBUG nova.network.os_vif_util [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.221 187247 DEBUG nova.network.os_vif_util [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.222 187247 DEBUG nova.objects.instance [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 20d06540-44a6-4c4c-ab2f-d4997af86fa0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.730 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] End _get_guest_xml xml=<domain type="kvm">
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <uuid>20d06540-44a6-4c4c-ab2f-d4997af86fa0</uuid>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <name>instance-00000020</name>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <memory>131072</memory>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <vcpu>1</vcpu>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1695178384</nova:name>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:25:35</nova:creationTime>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:25:35 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:25:35 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         <nova:port uuid="697e2ff1-393b-4c81-abc1-b7afc93f0e5b">
Dec 03 00:25:35 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <system>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <entry name="serial">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <entry name="uuid">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </system>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <os>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </os>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <features>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <vmcoreinfo/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </features>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact">
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <model>Nehalem</model>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:d1:af:d0"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <target dev="tap697e2ff1-39"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <video>
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </video>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <controller type="usb" index="0"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:25:35 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:25:35 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:25:35 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:25:35 compute-0 nova_compute[187243]: </domain>
Dec 03 00:25:35 compute-0 nova_compute[187243]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.731 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Preparing to wait for external event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.731 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.732 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.732 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.732 187247 DEBUG nova.virt.libvirt.vif [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-12-03T00:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1695178384',id=32,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-sxcn790n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T00:25:29Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.733 187247 DEBUG nova.network.os_vif_util [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converting VIF {"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.733 187247 DEBUG nova.network.os_vif_util [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.733 187247 DEBUG os_vif [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.734 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.734 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.735 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.735 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.735 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8b78b72f-5a3b-542b-b870-cfef03cf2b5f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.736 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.738 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.740 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.740 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap697e2ff1-39, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.741 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap697e2ff1-39, col_values=(('qos', UUID('58db511f-ee8b-4eef-b32d-0cd6482eb4b3')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.741 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap697e2ff1-39, col_values=(('external_ids', {'iface-id': '697e2ff1-393b-4c81-abc1-b7afc93f0e5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:af:d0', 'vm-uuid': '20d06540-44a6-4c4c-ab2f-d4997af86fa0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.742 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 NetworkManager[55671]: <info>  [1764721535.7428] manager: (tap697e2ff1-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.745 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.747 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:35 compute-0 nova_compute[187243]: 2025-12-03 00:25:35.748 187247 INFO os_vif [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39')
Dec 03 00:25:37 compute-0 nova_compute[187243]: 2025-12-03 00:25:37.300 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:25:37 compute-0 nova_compute[187243]: 2025-12-03 00:25:37.300 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 03 00:25:37 compute-0 nova_compute[187243]: 2025-12-03 00:25:37.300 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] No VIF found with MAC fa:16:3e:d1:af:d0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 03 00:25:37 compute-0 nova_compute[187243]: 2025-12-03 00:25:37.301 187247 INFO nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Using config drive
Dec 03 00:25:37 compute-0 nova_compute[187243]: 2025-12-03 00:25:37.835 187247 WARNING neutronclient.v2_0.client [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.183 187247 INFO nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Creating config drive at /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.187 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpqjs7wb_r execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.311 187247 DEBUG oslo_concurrency.processutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpqjs7wb_r" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:25:38 compute-0 kernel: tap697e2ff1-39: entered promiscuous mode
Dec 03 00:25:38 compute-0 NetworkManager[55671]: <info>  [1764721538.3748] manager: (tap697e2ff1-39): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Dec 03 00:25:38 compute-0 ovn_controller[95488]: 2025-12-03T00:25:38Z|00236|binding|INFO|Claiming lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b for this chassis.
Dec 03 00:25:38 compute-0 ovn_controller[95488]: 2025-12-03T00:25:38Z|00237|binding|INFO|697e2ff1-393b-4c81-abc1-b7afc93f0e5b: Claiming fa:16:3e:d1:af:d0 10.100.0.11
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.427 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:38 compute-0 systemd-udevd[222496]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.435 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:af:d0 10.100.0.11'], port_security=['fa:16:3e:d1:af:d0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '20d06540-44a6-4c4c-ab2f-d4997af86fa0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=697e2ff1-393b-4c81-abc1-b7afc93f0e5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.436 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 bound to our chassis
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.437 104379 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:25:38 compute-0 NetworkManager[55671]: <info>  [1764721538.4423] device (tap697e2ff1-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 00:25:38 compute-0 ovn_controller[95488]: 2025-12-03T00:25:38Z|00238|binding|INFO|Setting lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b ovn-installed in OVS
Dec 03 00:25:38 compute-0 ovn_controller[95488]: 2025-12-03T00:25:38Z|00239|binding|INFO|Setting lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b up in Southbound
Dec 03 00:25:38 compute-0 NetworkManager[55671]: <info>  [1764721538.4435] device (tap697e2ff1-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.444 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.449 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe5d9a0-a90d-4149-a516-1d34dfb099d0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.449 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7a76663-51 in ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.453 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7a76663-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.454 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9f1f13-f5ee-4995-b148-3fd899b1ad4b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.455 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aaaf38ff-0d57-4c7f-9af9-05d449819dd4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 systemd-machined[153518]: New machine qemu-22-instance-00000020.
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.465 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[729b3eae-98dd-497f-b335-f89b31316e92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-00000020.
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.481 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a38b21e1-33ad-4a56-b756-5197ad8cbd3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.510 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[86a91f24-a954-4021-b932-7b605860c732]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 NetworkManager[55671]: <info>  [1764721538.5153] manager: (tapf7a76663-50): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.514 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d10f010f-a2b3-4423-9270-790d8501a8cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 systemd-udevd[222501]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.541 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[9c35a5ab-4b07-48da-a058-0f78f889d1c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.544 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[c9619b42-7a12-4bae-b372-34a96f93e7a7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 NetworkManager[55671]: <info>  [1764721538.5694] device (tapf7a76663-50): carrier: link connected
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.580 209783 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d10328-c7ae-4a82-a71b-7094eda0da63]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.598 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[61d5efdb-760e-4de6-a181-747fdb7883a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562921, 'reachable_time': 38207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222534, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.613 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[887589e9-2b5d-42ab-a14c-d992defc0fa9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:f9e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562921, 'tstamp': 562921}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222535, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.627 187247 DEBUG nova.compute.manager [req-e97937f4-8e6f-467b-8c45-93d13c08e399 req-05f62c5d-5386-4bbf-aa1a-cad12587c5ef 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.627 187247 DEBUG oslo_concurrency.lockutils [req-e97937f4-8e6f-467b-8c45-93d13c08e399 req-05f62c5d-5386-4bbf-aa1a-cad12587c5ef 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.627 187247 DEBUG oslo_concurrency.lockutils [req-e97937f4-8e6f-467b-8c45-93d13c08e399 req-05f62c5d-5386-4bbf-aa1a-cad12587c5ef 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.628 187247 DEBUG oslo_concurrency.lockutils [req-e97937f4-8e6f-467b-8c45-93d13c08e399 req-05f62c5d-5386-4bbf-aa1a-cad12587c5ef 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.628 187247 DEBUG nova.compute.manager [req-e97937f4-8e6f-467b-8c45-93d13c08e399 req-05f62c5d-5386-4bbf-aa1a-cad12587c5ef 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Processing event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.630 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ea80ac-e5dd-4a96-9ee5-44202ae30a9e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7a76663-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:f9:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562921, 'reachable_time': 38207, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222536, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.658 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4ffa47-a429-48da-858e-64807b567ee8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.720 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[237f80d8-24ff-492a-93e2-b46861621063]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.722 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.723 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 03 00:25:38 compute-0 kernel: tapf7a76663-50: entered promiscuous mode
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.723 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a76663-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:38 compute-0 NetworkManager[55671]: <info>  [1764721538.7262] manager: (tapf7a76663-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.725 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.733 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7a76663-50, col_values=(('external_ids', {'iface-id': '45446e36-d2c9-4ea6-b9fb-83e2711350dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:25:38 compute-0 ovn_controller[95488]: 2025-12-03T00:25:38Z|00240|binding|INFO|Releasing lport 45446e36-d2c9-4ea6-b9fb-83e2711350dd from this chassis (sb_readonly=0)
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.735 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.737 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fb604613-bd50-4f1f-92d5-8a0cfd66d20f]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.738 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.738 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.738 104379 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.738 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.739 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a42217e1-b548-44d4-bba6-4fbcdbd2f26e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.739 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.740 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5def2c50-d675-456a-a858-18bb9304d727]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.740 104379 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: global
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     log         /dev/log local0 debug
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     log-tag     haproxy-metadata-proxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     user        root
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     group       root
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     maxconn     1024
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     pidfile     /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     daemon
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: defaults
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     log global
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     mode http
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     option httplog
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     option dontlognull
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     option http-server-close
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     option forwardfor
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     retries                 3
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     timeout http-request    30s
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     timeout connect         30s
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     timeout client          32s
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     timeout server          32s
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     timeout http-keep-alive 30s
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: listen listener
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     bind 169.254.169.254:80
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     server metadata /var/lib/neutron/metadata_proxy
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:     http-request add-header X-OVN-Network-ID f7a76663-52a3-4e8c-af8a-8ef26c8fecf2
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 03 00:25:38 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:38.741 104379 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'env', 'PROCESS_TAG=haproxy-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.920 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.923 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.926 187247 INFO nova.virt.libvirt.driver [-] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance spawned successfully.
Dec 03 00:25:38 compute-0 nova_compute[187243]: 2025-12-03 00:25:38.927 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 03 00:25:39 compute-0 podman[222575]: 2025-12-03 00:25:39.114245345 +0000 UTC m=+0.023353758 image pull 334cbcdc18f586cd47c824ffcaed6895e1b163d38d8f2256bff4131829d0b436 38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.443 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.443 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.443 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.444 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.444 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.444 187247 DEBUG nova.virt.libvirt.driver [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 03 00:25:39 compute-0 podman[222575]: 2025-12-03 00:25:39.808748628 +0000 UTC m=+0.717857021 container create 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:25:39 compute-0 systemd[1]: Started libpod-conmon-09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21.scope.
Dec 03 00:25:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 00:25:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4375adf7302661f64d0afaa53c100a1d93af8d423083608d11a8c9a5215ce3bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 00:25:39 compute-0 podman[222575]: 2025-12-03 00:25:39.897657296 +0000 UTC m=+0.806765689 container init 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:25:39 compute-0 podman[222575]: 2025-12-03 00:25:39.902398253 +0000 UTC m=+0.811506646 container start 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest)
Dec 03 00:25:39 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [NOTICE]   (222595) : New worker (222597) forked
Dec 03 00:25:39 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [NOTICE]   (222595) : Loading success.
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.954 187247 INFO nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Took 9.28 seconds to spawn the instance on the hypervisor.
Dec 03 00:25:39 compute-0 nova_compute[187243]: 2025-12-03 00:25:39.954 187247 DEBUG nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.093 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.498 187247 INFO nova.compute.manager [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Took 14.50 seconds to build instance.
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.702 187247 DEBUG nova.compute.manager [req-829eb090-b71c-4fa4-a768-b1ef98fcbddc req-af9edd99-fc38-4e1b-a7bb-f88a52d68e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.702 187247 DEBUG oslo_concurrency.lockutils [req-829eb090-b71c-4fa4-a768-b1ef98fcbddc req-af9edd99-fc38-4e1b-a7bb-f88a52d68e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.703 187247 DEBUG oslo_concurrency.lockutils [req-829eb090-b71c-4fa4-a768-b1ef98fcbddc req-af9edd99-fc38-4e1b-a7bb-f88a52d68e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.703 187247 DEBUG oslo_concurrency.lockutils [req-829eb090-b71c-4fa4-a768-b1ef98fcbddc req-af9edd99-fc38-4e1b-a7bb-f88a52d68e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.703 187247 DEBUG nova.compute.manager [req-829eb090-b71c-4fa4-a768-b1ef98fcbddc req-af9edd99-fc38-4e1b-a7bb-f88a52d68e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.703 187247 WARNING nova.compute.manager [req-829eb090-b71c-4fa4-a768-b1ef98fcbddc req-af9edd99-fc38-4e1b-a7bb-f88a52d68e59 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received unexpected event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with vm_state active and task_state None.
Dec 03 00:25:40 compute-0 nova_compute[187243]: 2025-12-03 00:25:40.742 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:41 compute-0 nova_compute[187243]: 2025-12-03 00:25:41.006 187247 DEBUG oslo_concurrency.lockutils [None req-fa6aa051-0c00-48b3-9286-317b973724ab 43c8524f2d244e8aa3019dd878dcfb81 a8545a5c94f84697a8605fadf08251f7 - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.031s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:41 compute-0 sshd-session[222497]: Invalid user vncuser from 45.78.222.160 port 60278
Dec 03 00:25:41 compute-0 sshd-session[222497]: Received disconnect from 45.78.222.160 port 60278:11: Bye Bye [preauth]
Dec 03 00:25:41 compute-0 sshd-session[222497]: Disconnected from invalid user vncuser 45.78.222.160 port 60278 [preauth]
Dec 03 00:25:45 compute-0 nova_compute[187243]: 2025-12-03 00:25:45.095 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:45 compute-0 nova_compute[187243]: 2025-12-03 00:25:45.744 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:46 compute-0 podman[222606]: 2025-12-03 00:25:46.090902787 +0000 UTC m=+0.045947857 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:25:50 compute-0 podman[222630]: 2025-12-03 00:25:50.093916067 +0000 UTC m=+0.052698253 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:25:50 compute-0 nova_compute[187243]: 2025-12-03 00:25:50.097 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:50 compute-0 podman[222631]: 2025-12-03 00:25:50.132614973 +0000 UTC m=+0.090079007 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:25:50 compute-0 nova_compute[187243]: 2025-12-03 00:25:50.746 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:52.117 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:25:52 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:25:52.117 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:25:52 compute-0 nova_compute[187243]: 2025-12-03 00:25:52.156 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:52 compute-0 sshd-session[222677]: Invalid user mika from 61.220.235.10 port 51456
Dec 03 00:25:52 compute-0 sshd-session[222677]: Received disconnect from 61.220.235.10 port 51456:11: Bye Bye [preauth]
Dec 03 00:25:52 compute-0 sshd-session[222677]: Disconnected from invalid user mika 61.220.235.10 port 51456 [preauth]
Dec 03 00:25:52 compute-0 nova_compute[187243]: 2025-12-03 00:25:52.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:53 compute-0 nova_compute[187243]: 2025-12-03 00:25:53.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:54 compute-0 ovn_controller[95488]: 2025-12-03T00:25:54Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:af:d0 10.100.0.11
Dec 03 00:25:54 compute-0 ovn_controller[95488]: 2025-12-03T00:25:54Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:af:d0 10.100.0.11
Dec 03 00:25:55 compute-0 nova_compute[187243]: 2025-12-03 00:25:55.101 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:55 compute-0 nova_compute[187243]: 2025-12-03 00:25:55.749 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:25:58 compute-0 nova_compute[187243]: 2025-12-03 00:25:58.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:58 compute-0 nova_compute[187243]: 2025-12-03 00:25:58.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:25:59 compute-0 nova_compute[187243]: 2025-12-03 00:25:59.107 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:25:59 compute-0 nova_compute[187243]: 2025-12-03 00:25:59.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:25:59 compute-0 nova_compute[187243]: 2025-12-03 00:25:59.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:25:59 compute-0 nova_compute[187243]: 2025-12-03 00:25:59.109 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:25:59 compute-0 podman[197600]: time="2025-12-03T00:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:25:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:25:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3065 "" "Go-http-client/1.1"
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.103 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.153 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.227 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.228 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.280 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.408 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.409 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.425 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.426 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5654MB free_disk=73.1329231262207GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.426 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.426 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:00.728 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:00.728 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:00.729 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:00 compute-0 nova_compute[187243]: 2025-12-03 00:26:00.751 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:01 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:01.118 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: ERROR   00:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: ERROR   00:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: ERROR   00:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: ERROR   00:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: ERROR   00:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:26:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:26:01 compute-0 nova_compute[187243]: 2025-12-03 00:26:01.465 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 20d06540-44a6-4c4c-ab2f-d4997af86fa0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 03 00:26:01 compute-0 nova_compute[187243]: 2025-12-03 00:26:01.466 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:26:01 compute-0 nova_compute[187243]: 2025-12-03 00:26:01.466 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:26:00 up  1:34,  0 user,  load average: 0.41, 0.30, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_a8545a5c94f84697a8605fadf08251f7': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:26:01 compute-0 nova_compute[187243]: 2025-12-03 00:26:01.496 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:26:02 compute-0 nova_compute[187243]: 2025-12-03 00:26:02.003 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:26:02 compute-0 nova_compute[187243]: 2025-12-03 00:26:02.512 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:26:02 compute-0 nova_compute[187243]: 2025-12-03 00:26:02.513 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:03 compute-0 nova_compute[187243]: 2025-12-03 00:26:03.513 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:03 compute-0 nova_compute[187243]: 2025-12-03 00:26:03.513 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:03 compute-0 nova_compute[187243]: 2025-12-03 00:26:03.513 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:03 compute-0 nova_compute[187243]: 2025-12-03 00:26:03.513 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:26:05 compute-0 nova_compute[187243]: 2025-12-03 00:26:05.145 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:05 compute-0 nova_compute[187243]: 2025-12-03 00:26:05.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:05 compute-0 nova_compute[187243]: 2025-12-03 00:26:05.753 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:06 compute-0 podman[222705]: 2025-12-03 00:26:06.120974331 +0000 UTC m=+0.070160815 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:26:06 compute-0 podman[222706]: 2025-12-03 00:26:06.140688728 +0000 UTC m=+0.082835388 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Dec 03 00:26:10 compute-0 nova_compute[187243]: 2025-12-03 00:26:10.147 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:10 compute-0 nova_compute[187243]: 2025-12-03 00:26:10.755 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:15 compute-0 nova_compute[187243]: 2025-12-03 00:26:15.149 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:15 compute-0 nova_compute[187243]: 2025-12-03 00:26:15.757 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:17 compute-0 podman[222748]: 2025-12-03 00:26:17.088224439 +0000 UTC m=+0.045793243 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:26:18 compute-0 sshd-session[222772]: Invalid user exx from 23.95.37.90 port 48940
Dec 03 00:26:18 compute-0 sshd-session[222772]: Received disconnect from 23.95.37.90 port 48940:11: Bye Bye [preauth]
Dec 03 00:26:18 compute-0 sshd-session[222772]: Disconnected from invalid user exx 23.95.37.90 port 48940 [preauth]
Dec 03 00:26:19 compute-0 nova_compute[187243]: 2025-12-03 00:26:19.898 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Check if temp file /var/lib/nova/instances/tmpp2pv6ixr exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 03 00:26:19 compute-0 nova_compute[187243]: 2025-12-03 00:26:19.901 187247 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='20d06540-44a6-4c4c-ab2f-d4997af86fa0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 03 00:26:20 compute-0 nova_compute[187243]: 2025-12-03 00:26:20.152 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:20 compute-0 nova_compute[187243]: 2025-12-03 00:26:20.759 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:21 compute-0 podman[222774]: 2025-12-03 00:26:21.093531456 +0000 UTC m=+0.051460673 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:26:21 compute-0 podman[222775]: 2025-12-03 00:26:21.127215038 +0000 UTC m=+0.081836613 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:26:25 compute-0 nova_compute[187243]: 2025-12-03 00:26:25.153 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:25 compute-0 nova_compute[187243]: 2025-12-03 00:26:25.761 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:25 compute-0 nova_compute[187243]: 2025-12-03 00:26:25.896 187247 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:25 compute-0 nova_compute[187243]: 2025-12-03 00:26:25.952 187247 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:25 compute-0 nova_compute[187243]: 2025-12-03 00:26:25.953 187247 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:26 compute-0 nova_compute[187243]: 2025-12-03 00:26:26.004 187247 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:26 compute-0 nova_compute[187243]: 2025-12-03 00:26:26.006 187247 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Preparing to wait for external event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 03 00:26:26 compute-0 nova_compute[187243]: 2025-12-03 00:26:26.006 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:26 compute-0 nova_compute[187243]: 2025-12-03 00:26:26.006 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:26 compute-0 nova_compute[187243]: 2025-12-03 00:26:26.007 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:29 compute-0 podman[197600]: time="2025-12-03T00:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:26:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Dec 03 00:26:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Dec 03 00:26:30 compute-0 nova_compute[187243]: 2025-12-03 00:26:30.155 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:30 compute-0 ovn_controller[95488]: 2025-12-03T00:26:30Z|00241|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec 03 00:26:30 compute-0 nova_compute[187243]: 2025-12-03 00:26:30.763 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: ERROR   00:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: ERROR   00:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: ERROR   00:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: ERROR   00:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: ERROR   00:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:26:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:26:35 compute-0 nova_compute[187243]: 2025-12-03 00:26:35.157 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:35 compute-0 nova_compute[187243]: 2025-12-03 00:26:35.803 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:36.570 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:26:36 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:36.570 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.571 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.601 187247 DEBUG nova.compute.manager [req-649951b2-d8f3-4bc4-a09a-30ddb06b334c req-c21f8d63-4650-49bb-8817-6134bbeaa985 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.602 187247 DEBUG oslo_concurrency.lockutils [req-649951b2-d8f3-4bc4-a09a-30ddb06b334c req-c21f8d63-4650-49bb-8817-6134bbeaa985 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.602 187247 DEBUG oslo_concurrency.lockutils [req-649951b2-d8f3-4bc4-a09a-30ddb06b334c req-c21f8d63-4650-49bb-8817-6134bbeaa985 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.602 187247 DEBUG oslo_concurrency.lockutils [req-649951b2-d8f3-4bc4-a09a-30ddb06b334c req-c21f8d63-4650-49bb-8817-6134bbeaa985 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.603 187247 DEBUG nova.compute.manager [req-649951b2-d8f3-4bc4-a09a-30ddb06b334c req-c21f8d63-4650-49bb-8817-6134bbeaa985 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No event matching network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b in dict_keys([('network-vif-plugged', '697e2ff1-393b-4c81-abc1-b7afc93f0e5b')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 03 00:26:36 compute-0 nova_compute[187243]: 2025-12-03 00:26:36.603 187247 DEBUG nova.compute.manager [req-649951b2-d8f3-4bc4-a09a-30ddb06b334c req-c21f8d63-4650-49bb-8817-6134bbeaa985 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:26:37 compute-0 podman[222825]: 2025-12-03 00:26:37.101864167 +0000 UTC m=+0.060484496 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 03 00:26:37 compute-0 podman[222826]: 2025-12-03 00:26:37.108616904 +0000 UTC m=+0.064743391 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.091 187247 INFO nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Took 12.08 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.659 187247 DEBUG nova.compute.manager [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG oslo_concurrency.lockutils [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG oslo_concurrency.lockutils [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG oslo_concurrency.lockutils [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG nova.compute.manager [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Processing event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG nova.compute.manager [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-changed-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG nova.compute.manager [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Refreshing instance network info cache due to event network-changed-697e2ff1-393b-4c81-abc1-b7afc93f0e5b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.660 187247 DEBUG oslo_concurrency.lockutils [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.661 187247 DEBUG oslo_concurrency.lockutils [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquired lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.661 187247 DEBUG nova.network.neutron [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Refreshing network info cache for port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 03 00:26:38 compute-0 nova_compute[187243]: 2025-12-03 00:26:38.662 187247 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.167 187247 WARNING neutronclient.v2_0.client [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.171 187247 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp2pv6ixr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='20d06540-44a6-4c4c-ab2f-d4997af86fa0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(469ff7b3-8c58-4bf8-aaf7-aa9867e5c0b7),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.599 187247 WARNING neutronclient.v2_0.client [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.685 187247 DEBUG nova.objects.instance [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lazy-loading 'migration_context' on Instance uuid 20d06540-44a6-4c4c-ab2f-d4997af86fa0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.686 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.688 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:26:39 compute-0 nova_compute[187243]: 2025-12-03 00:26:39.688 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.160 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.191 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.191 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.245 187247 DEBUG nova.network.neutron [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updated VIF entry in instance network info cache for port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.246 187247 DEBUG nova.network.neutron [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating instance_info_cache with network_info: [{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.573 187247 DEBUG nova.virt.libvirt.vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1695178384',id=32,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:25:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-sxcn790n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:25:40Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.574 187247 DEBUG nova.network.os_vif_util [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.574 187247 DEBUG nova.network.os_vif_util [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.575 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Updating guest XML with vif config: <interface type="ethernet">
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <mac address="fa:16:3e:d1:af:d0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <model type="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <mtu size="1442"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <target dev="tap697e2ff1-39"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]: </interface>
Dec 03 00:26:40 compute-0 nova_compute[187243]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.575 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <name>instance-00000020</name>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <uuid>20d06540-44a6-4c4c-ab2f-d4997af86fa0</uuid>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1695178384</nova:name>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:25:35</nova:creationTime>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:port uuid="697e2ff1-393b-4c81-abc1-b7afc93f0e5b">
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <system>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="serial">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="uuid">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </system>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <os>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </os>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <features>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </features>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:d1:af:d0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="tap697e2ff1-39"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </target>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </console>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </input>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <video>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </video>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]: </domain>
Dec 03 00:26:40 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.577 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <name>instance-00000020</name>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <uuid>20d06540-44a6-4c4c-ab2f-d4997af86fa0</uuid>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1695178384</nova:name>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:25:35</nova:creationTime>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:port uuid="697e2ff1-393b-4c81-abc1-b7afc93f0e5b">
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <system>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="serial">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="uuid">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </system>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <os>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </os>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <features>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </features>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:d1:af:d0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="tap697e2ff1-39"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </target>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </console>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </input>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <video>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </video>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]: </domain>
Dec 03 00:26:40 compute-0 nova_compute[187243]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.578 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <name>instance-00000020</name>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <uuid>20d06540-44a6-4c4c-ab2f-d4997af86fa0</uuid>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <metadata>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1695178384</nova:name>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:creationTime>2025-12-03 00:25:35</nova:creationTime>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:flavor name="m1.nano" id="b2669e62-ef04-4b34-b3d6-69efcfbafbdc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:memory>128</nova:memory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:disk>1</nova:disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:swap>0</nova:swap>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:vcpus>1</nova:vcpus>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:extraSpecs>
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:extraSpecs>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:flavor>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:image uuid="92e79321-71af-44a0-869c-1d5a9da5fefc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:minDisk>1</nova:minDisk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:minRam>0</nova:minRam>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:properties>
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:properties>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:image>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:owner>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:user uuid="43c8524f2d244e8aa3019dd878dcfb81">tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin</nova:user>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:project uuid="a8545a5c94f84697a8605fadf08251f7">tempest-TestExecuteZoneMigrationStrategy-558903593</nova:project>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:owner>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:root type="image" uuid="92e79321-71af-44a0-869c-1d5a9da5fefc"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <nova:ports>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <nova:port uuid="697e2ff1-393b-4c81-abc1-b7afc93f0e5b">
Dec 03 00:26:40 compute-0 nova_compute[187243]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:         </nova:port>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </nova:ports>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </nova:instance>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </metadata>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <memory unit="KiB">131072</memory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <vcpu placement="static">1</vcpu>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <resource>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <partition>/machine</partition>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </resource>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <sysinfo type="smbios">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <system>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="manufacturer">RDO</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="product">OpenStack Compute</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="serial">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="uuid">20d06540-44a6-4c4c-ab2f-d4997af86fa0</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <entry name="family">Virtual Machine</entry>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </system>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </sysinfo>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <os>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <boot dev="hd"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <smbios mode="sysinfo"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </os>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <features>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <acpi/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <apic/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <vmcoreinfo state="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </features>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <cpu mode="custom" match="exact" check="partial">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <model fallback="allow">Nehalem</model>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </cpu>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <clock offset="utc">
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <timer name="hpet" present="no"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </clock>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_poweroff>destroy</on_poweroff>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_reboot>restart</on_reboot>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <on_crash>destroy</on_crash>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <devices>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <disk type="file" device="disk">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="vda" bus="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <disk type="file" device="cdrom">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="qemu" type="raw" cache="none"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <source file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/disk.config"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="sda" bus="sata"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <readonly/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </disk>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="1" port="0x10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="2" port="0x11"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="3" port="0x12"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="4" port="0x13"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="5" port="0x14"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="6" port="0x15"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="7" port="0x16"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="8" port="0x17"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="9" port="0x18"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="10" port="0x19"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="11" port="0x1a"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="12" port="0x1b"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="13" port="0x1c"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="14" port="0x1d"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="15" port="0x1e"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="16" port="0x1f"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="17" port="0x20"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="18" port="0x21"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="19" port="0x22"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="20" port="0x23"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="21" port="0x24"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="22" port="0x25"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="23" port="0x26"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="24" port="0x27"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-root-port"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target chassis="25" port="0x28"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model name="pcie-pci-bridge"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <controller type="sata" index="0">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </controller>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <interface type="ethernet">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <mac address="fa:16:3e:d1:af:d0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model type="virtio"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <driver name="vhost" rx_queue_size="512"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <mtu size="1442"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target dev="tap697e2ff1-39"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </interface>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <serial type="pty">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target type="isa-serial" port="0">
Dec 03 00:26:40 compute-0 nova_compute[187243]:         <model name="isa-serial"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       </target>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </serial>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <console type="pty">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <log file="/var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0/console.log" append="off"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <target type="serial" port="0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </console>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <input type="tablet" bus="usb">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="usb" bus="0" port="1"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </input>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <input type="mouse" bus="ps2"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <listen type="address" address="::"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </graphics>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <video>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <model type="virtio" heads="1" primary="yes"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </video>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <stats period="10"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </memballoon>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     <rng model="virtio">
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <backend model="random">/dev/urandom</backend>
Dec 03 00:26:40 compute-0 nova_compute[187243]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]:     </rng>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   </devices>
Dec 03 00:26:40 compute-0 nova_compute[187243]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 03 00:26:40 compute-0 nova_compute[187243]: </domain>
Dec 03 00:26:40 compute-0 nova_compute[187243]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.579 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.694 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.695 187247 INFO nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.754 187247 DEBUG oslo_concurrency.lockutils [req-57485c65-9ec9-44e7-a268-b0a8c1ecb73d req-f9535dc9-0983-4b56-a911-4b206e12ea2f 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Releasing lock "refresh_cache-20d06540-44a6-4c4c-ab2f-d4997af86fa0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 03 00:26:40 compute-0 nova_compute[187243]: 2025-12-03 00:26:40.805 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:41 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:41.572 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:41 compute-0 nova_compute[187243]: 2025-12-03 00:26:41.714 187247 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 03 00:26:42 compute-0 nova_compute[187243]: 2025-12-03 00:26:42.217 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:26:42 compute-0 nova_compute[187243]: 2025-12-03 00:26:42.218 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:26:42 compute-0 nova_compute[187243]: 2025-12-03 00:26:42.721 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:26:42 compute-0 nova_compute[187243]: 2025-12-03 00:26:42.721 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:26:43 compute-0 nova_compute[187243]: 2025-12-03 00:26:43.224 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:26:43 compute-0 nova_compute[187243]: 2025-12-03 00:26:43.224 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:26:43 compute-0 nova_compute[187243]: 2025-12-03 00:26:43.729 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:26:43 compute-0 nova_compute[187243]: 2025-12-03 00:26:43.730 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.235 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.236 187247 DEBUG nova.virt.libvirt.migration [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Dec 03 00:26:44 compute-0 kernel: tap697e2ff1-39 (unregistering): left promiscuous mode
Dec 03 00:26:44 compute-0 NetworkManager[55671]: <info>  [1764721604.2473] device (tap697e2ff1-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 03 00:26:44 compute-0 ovn_controller[95488]: 2025-12-03T00:26:44Z|00242|binding|INFO|Releasing lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b from this chassis (sb_readonly=0)
Dec 03 00:26:44 compute-0 ovn_controller[95488]: 2025-12-03T00:26:44Z|00243|binding|INFO|Setting lport 697e2ff1-393b-4c81-abc1-b7afc93f0e5b down in Southbound
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.250 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:44 compute-0 ovn_controller[95488]: 2025-12-03T00:26:44Z|00244|binding|INFO|Removing iface tap697e2ff1-39 ovn-installed in OVS
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.252 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.267 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:44 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000020.scope: Deactivated successfully.
Dec 03 00:26:44 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000020.scope: Consumed 15.182s CPU time.
Dec 03 00:26:44 compute-0 systemd-machined[153518]: Machine qemu-22-instance-00000020 terminated.
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.486 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.487 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.487 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.738 187247 DEBUG nova.virt.libvirt.guest [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '20d06540-44a6-4c4c-ab2f-d4997af86fa0' (instance-00000020) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.738 187247 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migration operation has completed
Dec 03 00:26:44 compute-0 nova_compute[187243]: 2025-12-03 00:26:44.738 187247 INFO nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] _post_live_migration() is started..
Dec 03 00:26:45 compute-0 ovn_controller[95488]: 2025-12-03T00:26:45Z|00245|binding|INFO|Releasing lport 45446e36-d2c9-4ea6-b9fb-83e2711350dd from this chassis (sb_readonly=0)
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.049 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:af:d0 10.100.0.11'], port_security=['fa:16:3e:d1:af:d0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e895a64d-10b7-4a6e-a7ff-0745f1562623'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '20d06540-44a6-4c4c-ab2f-d4997af86fa0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8545a5c94f84697a8605fadf08251f7', 'neutron:revision_number': '10', 'neutron:security_group_ids': '85b55f5e-0cbc-47d6-baaa-5c5f70692f0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c447000-beb4-4b86-8116-0ff3837374dd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>], logical_port=697e2ff1-393b-4c81-abc1-b7afc93f0e5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f8e26ea1b50>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.050 104379 INFO neutron.agent.ovn.metadata.agent [-] Port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b in datapath f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 unbound from our chassis
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.051 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.052 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d3352bfe-b810-4af3-b8c4-05274154b492]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.053 104379 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 namespace which is not needed anymore
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.071 187247 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.071 187247 WARNING neutronclient.v2_0.client [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.126 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.204 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:45 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [NOTICE]   (222595) : haproxy version is 3.0.5-8e879a5
Dec 03 00:26:45 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [NOTICE]   (222595) : path to executable is /usr/sbin/haproxy
Dec 03 00:26:45 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [WARNING]  (222595) : Exiting Master process...
Dec 03 00:26:45 compute-0 podman[222921]: 2025-12-03 00:26:45.222496263 +0000 UTC m=+0.072850603 container kill 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:26:45 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [ALERT]    (222595) : Current worker (222597) exited with code 143 (Terminated)
Dec 03 00:26:45 compute-0 neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2[222591]: [WARNING]  (222595) : All workers exited. Exiting... (0)
Dec 03 00:26:45 compute-0 systemd[1]: libpod-09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21.scope: Deactivated successfully.
Dec 03 00:26:45 compute-0 podman[222935]: 2025-12-03 00:26:45.263608198 +0000 UTC m=+0.020962208 container died 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:26:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21-userdata-shm.mount: Deactivated successfully.
Dec 03 00:26:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-4375adf7302661f64d0afaa53c100a1d93af8d423083608d11a8c9a5215ce3bd-merged.mount: Deactivated successfully.
Dec 03 00:26:45 compute-0 podman[222935]: 2025-12-03 00:26:45.298566712 +0000 UTC m=+0.055920722 container cleanup 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 03 00:26:45 compute-0 systemd[1]: libpod-conmon-09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21.scope: Deactivated successfully.
Dec 03 00:26:45 compute-0 podman[222937]: 2025-12-03 00:26:45.316891855 +0000 UTC m=+0.070184535 container remove 09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.331 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c6a2a6-969c-4c34-8f9f-86f4f25d33d0]: (4, ("Wed Dec  3 12:26:45 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21)\n09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21\nWed Dec  3 12:26:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 (09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21)\n09dcca957f25d6627f1f88e7dd54fd4def7b7a344adf62bc7907246c763eab21\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.332 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9f0455-7e6b-4326-91c8-4dec140bdd96]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.332 104379 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7a76663-52a3-4e8c-af8a-8ef26c8fecf2.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.332 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ca523070-b130-495d-b01f-ecac0552191a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.333 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a76663-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.334 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:45 compute-0 kernel: tapf7a76663-50: left promiscuous mode
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.349 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.352 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[68f01b6e-4b42-4697-a149-e92964933c36]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.370 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f14711a8-5989-4608-906f-45ca5cff43a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.372 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[90238d1c-c0fe-4208-ade1-4b14bc33b623]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.386 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb34d62-9d71-47f1-845e-2b5f4a1ad690]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562914, 'reachable_time': 23299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222969, 'error': None, 'target': 'ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.388 104499 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7a76663-52a3-4e8c-af8a-8ef26c8fecf2 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 03 00:26:45 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:26:45.388 104499 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b3cac4-e4db-429e-b81e-a1158de813e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:26:45 compute-0 systemd[1]: run-netns-ovnmeta\x2df7a76663\x2d52a3\x2d4e8c\x2daf8a\x2d8ef26c8fecf2.mount: Deactivated successfully.
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.807 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.908 187247 DEBUG nova.compute.manager [req-3fa204f4-3c6c-49af-9d51-3efa503d8ee2 req-51df0c1f-600a-470f-af68-32f1b6301b7b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.909 187247 DEBUG oslo_concurrency.lockutils [req-3fa204f4-3c6c-49af-9d51-3efa503d8ee2 req-51df0c1f-600a-470f-af68-32f1b6301b7b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.909 187247 DEBUG oslo_concurrency.lockutils [req-3fa204f4-3c6c-49af-9d51-3efa503d8ee2 req-51df0c1f-600a-470f-af68-32f1b6301b7b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.909 187247 DEBUG oslo_concurrency.lockutils [req-3fa204f4-3c6c-49af-9d51-3efa503d8ee2 req-51df0c1f-600a-470f-af68-32f1b6301b7b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.909 187247 DEBUG nova.compute.manager [req-3fa204f4-3c6c-49af-9d51-3efa503d8ee2 req-51df0c1f-600a-470f-af68-32f1b6301b7b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:26:45 compute-0 nova_compute[187243]: 2025-12-03 00:26:45.910 187247 DEBUG nova.compute.manager [req-3fa204f4-3c6c-49af-9d51-3efa503d8ee2 req-51df0c1f-600a-470f-af68-32f1b6301b7b 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.270 187247 DEBUG nova.network.neutron [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Activated binding for port 697e2ff1-393b-4c81-abc1-b7afc93f0e5b and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.270 187247 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.271 187247 DEBUG nova.virt.libvirt.vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-03T00:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1695178384',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1695178384',id=32,image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T00:25:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a8545a5c94f84697a8605fadf08251f7',ramdisk_id='',reservation_id='r-sxcn790n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,manager,reader',image_base_image_ref='92e79321-71af-44a0-869c-1d5a9da5fefc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-558903593',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-558903593-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T00:26:12Z,user_data=None,user_id='43c8524f2d244e8aa3019dd878dcfb81',uuid=20d06540-44a6-4c4c-ab2f-d4997af86fa0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.271 187247 DEBUG nova.network.os_vif_util [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converting VIF {"id": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "address": "fa:16:3e:d1:af:d0", "network": {"id": "f7a76663-52a3-4e8c-af8a-8ef26c8fecf2", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2141975601-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "714680a21a7947948f824493a7b261e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap697e2ff1-39", "ovs_interfaceid": "697e2ff1-393b-4c81-abc1-b7afc93f0e5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.272 187247 DEBUG nova.network.os_vif_util [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.272 187247 DEBUG os_vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.274 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.274 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap697e2ff1-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.275 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.278 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.279 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.279 187247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=58db511f-ee8b-4eef-b32d-0cd6482eb4b3) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.280 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.281 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.283 187247 INFO os_vif [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:af:d0,bridge_name='br-int',has_traffic_filtering=True,id=697e2ff1-393b-4c81-abc1-b7afc93f0e5b,network=Network(f7a76663-52a3-4e8c-af8a-8ef26c8fecf2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap697e2ff1-39')
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.283 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.284 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.284 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.284 187247 DEBUG nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.284 187247 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Deleting instance files /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0_del
Dec 03 00:26:47 compute-0 nova_compute[187243]: 2025-12-03 00:26:47.285 187247 INFO nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Deletion of /var/lib/nova/instances/20d06540-44a6-4c4c-ab2f-d4997af86fa0_del complete
Dec 03 00:26:48 compute-0 podman[222970]: 2025-12-03 00:26:48.098894099 +0000 UTC m=+0.051930704 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.306 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.306 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.307 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.307 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.308 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.308 187247 WARNING nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received unexpected event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with vm_state active and task_state migrating.
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.308 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.309 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.309 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.310 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.310 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.310 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-unplugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.311 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.311 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.312 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.312 187247 DEBUG oslo_concurrency.lockutils [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.312 187247 DEBUG nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] No waiting events found dispatching network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 03 00:26:48 compute-0 nova_compute[187243]: 2025-12-03 00:26:48.312 187247 WARNING nova.compute.manager [req-485e73b3-77c0-4284-b8d8-f648fa6992f0 req-9b04914e-561f-491e-bb4a-079b8c1d4dea 9ae25b0e0ab445c2bc5b1f71a7cc08e3 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Received unexpected event network-vif-plugged-697e2ff1-393b-4c81-abc1-b7afc93f0e5b for instance with vm_state active and task_state migrating.
Dec 03 00:26:50 compute-0 nova_compute[187243]: 2025-12-03 00:26:50.391 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:51 compute-0 sshd-session[222994]: Invalid user frappe from 20.123.120.169 port 55980
Dec 03 00:26:52 compute-0 podman[222996]: 2025-12-03 00:26:52.070472703 +0000 UTC m=+0.079881845 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 03 00:26:52 compute-0 podman[222997]: 2025-12-03 00:26:52.074238256 +0000 UTC m=+0.081571347 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:26:52 compute-0 sshd-session[222994]: Received disconnect from 20.123.120.169 port 55980:11: Bye Bye [preauth]
Dec 03 00:26:52 compute-0 sshd-session[222994]: Disconnected from invalid user frappe 20.123.120.169 port 55980 [preauth]
Dec 03 00:26:52 compute-0 nova_compute[187243]: 2025-12-03 00:26:52.280 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:54 compute-0 nova_compute[187243]: 2025-12-03 00:26:54.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:55 compute-0 nova_compute[187243]: 2025-12-03 00:26:55.392 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:55 compute-0 nova_compute[187243]: 2025-12-03 00:26:55.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:57 compute-0 nova_compute[187243]: 2025-12-03 00:26:57.282 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:26:57 compute-0 nova_compute[187243]: 2025-12-03 00:26:57.855 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:57 compute-0 nova_compute[187243]: 2025-12-03 00:26:57.856 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:57 compute-0 nova_compute[187243]: 2025-12-03 00:26:57.856 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "20d06540-44a6-4c4c-ab2f-d4997af86fa0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.372 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.372 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.373 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.373 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.516 187247 WARNING nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.518 187247 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.537 187247 DEBUG oslo_concurrency.processutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.539 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5826MB free_disk=73.16167068481445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.539 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:26:58 compute-0 nova_compute[187243]: 2025-12-03 00:26:58.539 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:26:59 compute-0 nova_compute[187243]: 2025-12-03 00:26:59.563 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration for instance 20d06540-44a6-4c4c-ab2f-d4997af86fa0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 03 00:26:59 compute-0 nova_compute[187243]: 2025-12-03 00:26:59.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:26:59 compute-0 nova_compute[187243]: 2025-12-03 00:26:59.591 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:26:59 compute-0 podman[197600]: time="2025-12-03T00:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:26:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:26:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2607 "" "Go-http-client/1.1"
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.072 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.131 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Migration 469ff7b3-8c58-4bf8-aaf7-aa9867e5c0b7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.131 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.132 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:26:58 up  1:35,  0 user,  load average: 0.21, 0.26, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.181 187247 DEBUG nova.compute.provider_tree [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.393 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:00 compute-0 nova_compute[187243]: 2025-12-03 00:27:00.690 187247 DEBUG nova.scheduler.client.report [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:27:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:00.730 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:00.730 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:00.730 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.203 187247 DEBUG nova.compute.resource_tracker [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.204 187247 DEBUG oslo_concurrency.lockutils [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.664s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.209 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.210 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.210 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.233 187247 INFO nova.compute.manager [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.367 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.368 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.384 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.385 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5829MB free_disk=73.16167068481445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.385 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:27:01 compute-0 nova_compute[187243]: 2025-12-03 00:27:01.386 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: ERROR   00:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: ERROR   00:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: ERROR   00:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: ERROR   00:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: ERROR   00:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:27:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:27:02 compute-0 nova_compute[187243]: 2025-12-03 00:27:02.284 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:03 compute-0 nova_compute[187243]: 2025-12-03 00:27:03.168 187247 INFO nova.scheduler.client.report [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] Deleted allocation for migration 469ff7b3-8c58-4bf8-aaf7-aa9867e5c0b7
Dec 03 00:27:03 compute-0 nova_compute[187243]: 2025-12-03 00:27:03.169 187247 DEBUG nova.virt.libvirt.driver [None req-3acf3c51-6ad4-42cd-b5cf-f36baaeb1cd3 6660e26e0b5b4c4aaf109b9bf2041810 1ca6deab53154b69a9bb7cede3b4778b - - default default] [instance: 20d06540-44a6-4c4c-ab2f-d4997af86fa0] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 03 00:27:03 compute-0 nova_compute[187243]: 2025-12-03 00:27:03.668 187247 INFO nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Instance 469ff7b3-8c58-4bf8-aaf7-aa9867e5c0b7 has allocations against this compute host but is not found in the database.
Dec 03 00:27:03 compute-0 nova_compute[187243]: 2025-12-03 00:27:03.668 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:27:03 compute-0 nova_compute[187243]: 2025-12-03 00:27:03.669 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:27:01 up  1:35,  0 user,  load average: 0.19, 0.25, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:27:03 compute-0 nova_compute[187243]: 2025-12-03 00:27:03.690 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:27:04 compute-0 nova_compute[187243]: 2025-12-03 00:27:04.197 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:27:04 compute-0 nova_compute[187243]: 2025-12-03 00:27:04.708 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:27:04 compute-0 nova_compute[187243]: 2025-12-03 00:27:04.708 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.323s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:27:05 compute-0 nova_compute[187243]: 2025-12-03 00:27:05.396 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:06 compute-0 nova_compute[187243]: 2025-12-03 00:27:06.705 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:06 compute-0 nova_compute[187243]: 2025-12-03 00:27:06.705 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:07 compute-0 nova_compute[187243]: 2025-12-03 00:27:07.288 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:07 compute-0 nova_compute[187243]: 2025-12-03 00:27:07.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:08 compute-0 podman[223041]: 2025-12-03 00:27:08.091271394 +0000 UTC m=+0.049205497 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:27:08 compute-0 podman[223042]: 2025-12-03 00:27:08.102297216 +0000 UTC m=+0.056433516 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, release=1755695350)
Dec 03 00:27:10 compute-0 nova_compute[187243]: 2025-12-03 00:27:10.397 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:12 compute-0 nova_compute[187243]: 2025-12-03 00:27:12.290 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:15 compute-0 nova_compute[187243]: 2025-12-03 00:27:15.400 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:16 compute-0 nova_compute[187243]: 2025-12-03 00:27:16.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:17 compute-0 nova_compute[187243]: 2025-12-03 00:27:17.293 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:19 compute-0 podman[223085]: 2025-12-03 00:27:19.101504091 +0000 UTC m=+0.047056124 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:27:20 compute-0 sshd-session[223086]: Invalid user adam from 61.220.235.10 port 50610
Dec 03 00:27:20 compute-0 sshd-session[223086]: Received disconnect from 61.220.235.10 port 50610:11: Bye Bye [preauth]
Dec 03 00:27:20 compute-0 sshd-session[223086]: Disconnected from invalid user adam 61.220.235.10 port 50610 [preauth]
Dec 03 00:27:20 compute-0 nova_compute[187243]: 2025-12-03 00:27:20.401 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:22 compute-0 nova_compute[187243]: 2025-12-03 00:27:22.295 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:23 compute-0 podman[223112]: 2025-12-03 00:27:23.13037099 +0000 UTC m=+0.083783742 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 03 00:27:23 compute-0 podman[223113]: 2025-12-03 00:27:23.147090693 +0000 UTC m=+0.100468714 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Dec 03 00:27:25 compute-0 nova_compute[187243]: 2025-12-03 00:27:25.404 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:27 compute-0 nova_compute[187243]: 2025-12-03 00:27:27.297 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:29 compute-0 podman[197600]: time="2025-12-03T00:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:27:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:27:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:27:30 compute-0 nova_compute[187243]: 2025-12-03 00:27:30.406 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: ERROR   00:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: ERROR   00:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: ERROR   00:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: ERROR   00:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: ERROR   00:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:27:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:27:32 compute-0 nova_compute[187243]: 2025-12-03 00:27:32.299 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:32 compute-0 sshd-session[223154]: Invalid user sipv from 101.47.140.127 port 37862
Dec 03 00:27:33 compute-0 sshd-session[223154]: Received disconnect from 101.47.140.127 port 37862:11: Bye Bye [preauth]
Dec 03 00:27:33 compute-0 sshd-session[223154]: Disconnected from invalid user sipv 101.47.140.127 port 37862 [preauth]
Dec 03 00:27:35 compute-0 nova_compute[187243]: 2025-12-03 00:27:35.408 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:37 compute-0 nova_compute[187243]: 2025-12-03 00:27:37.301 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:37 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:37.527 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:37 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:37.527 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:27:37 compute-0 nova_compute[187243]: 2025-12-03 00:27:37.557 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:39 compute-0 podman[223157]: 2025-12-03 00:27:39.102498503 +0000 UTC m=+0.054488018 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:27:39 compute-0 podman[223158]: 2025-12-03 00:27:39.134409771 +0000 UTC m=+0.081907985 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:27:39 compute-0 sshd-session[223195]: Invalid user andy from 23.95.37.90 port 37428
Dec 03 00:27:39 compute-0 sshd-session[223195]: Received disconnect from 23.95.37.90 port 37428:11: Bye Bye [preauth]
Dec 03 00:27:39 compute-0 sshd-session[223195]: Disconnected from invalid user andy 23.95.37.90 port 37428 [preauth]
Dec 03 00:27:40 compute-0 nova_compute[187243]: 2025-12-03 00:27:40.409 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:42 compute-0 nova_compute[187243]: 2025-12-03 00:27:42.303 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:42 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:42.529 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:27:45 compute-0 nova_compute[187243]: 2025-12-03 00:27:45.412 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:47 compute-0 nova_compute[187243]: 2025-12-03 00:27:47.305 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:48.097 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:2e:d3 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5bcb6274878430cbf268fcd97e3d9d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41d502de-899a-45f5-a018-49c03d644872, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=aeb951c1-76c1-4a80-a37e-114fc110daf0) old=Port_Binding(mac=['fa:16:3e:5f:2e:d3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47c9dea6-51f8-4918-b7de-0893eb139352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5bcb6274878430cbf268fcd97e3d9d5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:48.099 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port aeb951c1-76c1-4a80-a37e-114fc110daf0 in datapath 47c9dea6-51f8-4918-b7de-0893eb139352 updated
Dec 03 00:27:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:48.101 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47c9dea6-51f8-4918-b7de-0893eb139352, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:27:48 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:48.102 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3bef58-1dae-4423-8832-31007701300c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:50 compute-0 podman[223198]: 2025-12-03 00:27:50.09127203 +0000 UTC m=+0.049423613 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:27:50 compute-0 nova_compute[187243]: 2025-12-03 00:27:50.414 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:52 compute-0 nova_compute[187243]: 2025-12-03 00:27:52.306 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:54 compute-0 podman[223224]: 2025-12-03 00:27:54.101327586 +0000 UTC m=+0.055907113 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202)
Dec 03 00:27:54 compute-0 podman[223225]: 2025-12-03 00:27:54.165538693 +0000 UTC m=+0.117370482 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:27:54 compute-0 nova_compute[187243]: 2025-12-03 00:27:54.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:55 compute-0 nova_compute[187243]: 2025-12-03 00:27:55.415 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:56 compute-0 nova_compute[187243]: 2025-12-03 00:27:56.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:57 compute-0 nova_compute[187243]: 2025-12-03 00:27:57.308 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:27:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:58.413 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:3f:d2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '079699d388d64224949dbfaf77fa93bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3a59f07-3b05-4b11-8a33-06a5d4e97331, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0b93d847-3f88-4d4a-9d9c-eebaaf22c0f2) old=Port_Binding(mac=['fa:16:3e:e2:3f:d2'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f8079be-7802-4cbd-9c9c-c0cb589fc871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '079699d388d64224949dbfaf77fa93bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:27:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:58.413 104379 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0b93d847-3f88-4d4a-9d9c-eebaaf22c0f2 in datapath 2f8079be-7802-4cbd-9c9c-c0cb589fc871 updated
Dec 03 00:27:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:58.415 104379 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f8079be-7802-4cbd-9c9c-c0cb589fc871, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 03 00:27:58 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:27:58.415 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9101ad02-748f-4d58-932c-fa60d6b53025]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 03 00:27:59 compute-0 nova_compute[187243]: 2025-12-03 00:27:59.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:27:59 compute-0 nova_compute[187243]: 2025-12-03 00:27:59.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:27:59 compute-0 podman[197600]: time="2025-12-03T00:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:27:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:27:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2607 "" "Go-http-client/1.1"
Dec 03 00:28:00 compute-0 sshd-session[223272]: Invalid user user from 78.128.112.74 port 47378
Dec 03 00:28:00 compute-0 sshd-session[223272]: Connection closed by invalid user user 78.128.112.74 port 47378 [preauth]
Dec 03 00:28:00 compute-0 nova_compute[187243]: 2025-12-03 00:28:00.418 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:00 compute-0 nova_compute[187243]: 2025-12-03 00:28:00.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:28:00.731 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:28:00.731 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:28:00.731 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:01 compute-0 sshd-session[223270]: Invalid user system from 45.78.222.160 port 43230
Dec 03 00:28:01 compute-0 sshd-session[223270]: Received disconnect from 45.78.222.160 port 43230:11: Bye Bye [preauth]
Dec 03 00:28:01 compute-0 sshd-session[223270]: Disconnected from invalid user system 45.78.222.160 port 43230 [preauth]
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: ERROR   00:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: ERROR   00:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: ERROR   00:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: ERROR   00:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: ERROR   00:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:28:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:28:02 compute-0 nova_compute[187243]: 2025-12-03 00:28:02.309 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:02 compute-0 nova_compute[187243]: 2025-12-03 00:28:02.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.103 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.257 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.258 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.276 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.277 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.16159057617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.277 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:28:03 compute-0 nova_compute[187243]: 2025-12-03 00:28:03.277 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.333 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.334 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:28:03 up  1:36,  0 user,  load average: 0.14, 0.22, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.420 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.436 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.437 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.453 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.477 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:28:04 compute-0 nova_compute[187243]: 2025-12-03 00:28:04.497 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:28:05 compute-0 nova_compute[187243]: 2025-12-03 00:28:05.003 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:28:05 compute-0 nova_compute[187243]: 2025-12-03 00:28:05.419 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:05 compute-0 nova_compute[187243]: 2025-12-03 00:28:05.512 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:28:05 compute-0 nova_compute[187243]: 2025-12-03 00:28:05.513 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.235s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:28:05 compute-0 nova_compute[187243]: 2025-12-03 00:28:05.513 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:05 compute-0 nova_compute[187243]: 2025-12-03 00:28:05.513 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:28:06 compute-0 nova_compute[187243]: 2025-12-03 00:28:06.019 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:06 compute-0 sshd-session[223268]: Received disconnect from 45.78.219.213 port 56418:11: Bye Bye [preauth]
Dec 03 00:28:06 compute-0 sshd-session[223268]: Disconnected from 45.78.219.213 port 56418 [preauth]
Dec 03 00:28:07 compute-0 nova_compute[187243]: 2025-12-03 00:28:07.311 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:07 compute-0 nova_compute[187243]: 2025-12-03 00:28:07.523 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:07 compute-0 nova_compute[187243]: 2025-12-03 00:28:07.524 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:07 compute-0 nova_compute[187243]: 2025-12-03 00:28:07.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:08 compute-0 ovn_controller[95488]: 2025-12-03T00:28:08Z|00246|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 03 00:28:10 compute-0 podman[223279]: 2025-12-03 00:28:10.110706621 +0000 UTC m=+0.062420234 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 03 00:28:10 compute-0 podman[223278]: 2025-12-03 00:28:10.127654809 +0000 UTC m=+0.084648562 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 03 00:28:10 compute-0 nova_compute[187243]: 2025-12-03 00:28:10.436 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:11 compute-0 sshd-session[223276]: Invalid user deploy from 45.78.219.95 port 36162
Dec 03 00:28:12 compute-0 nova_compute[187243]: 2025-12-03 00:28:12.313 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:12 compute-0 sshd-session[223276]: Received disconnect from 45.78.219.95 port 36162:11: Bye Bye [preauth]
Dec 03 00:28:12 compute-0 sshd-session[223276]: Disconnected from invalid user deploy 45.78.219.95 port 36162 [preauth]
Dec 03 00:28:13 compute-0 nova_compute[187243]: 2025-12-03 00:28:13.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:13 compute-0 nova_compute[187243]: 2025-12-03 00:28:13.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:28:14 compute-0 nova_compute[187243]: 2025-12-03 00:28:14.109 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:28:15 compute-0 nova_compute[187243]: 2025-12-03 00:28:15.438 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:17 compute-0 nova_compute[187243]: 2025-12-03 00:28:17.315 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:19 compute-0 sshd-session[223319]: Invalid user zmarin from 20.123.120.169 port 45008
Dec 03 00:28:19 compute-0 sshd-session[223319]: Received disconnect from 20.123.120.169 port 45008:11: Bye Bye [preauth]
Dec 03 00:28:19 compute-0 sshd-session[223319]: Disconnected from invalid user zmarin 20.123.120.169 port 45008 [preauth]
Dec 03 00:28:20 compute-0 nova_compute[187243]: 2025-12-03 00:28:20.440 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:21 compute-0 podman[223321]: 2025-12-03 00:28:21.093419648 +0000 UTC m=+0.048774827 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:28:22 compute-0 nova_compute[187243]: 2025-12-03 00:28:22.317 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:25 compute-0 podman[223346]: 2025-12-03 00:28:25.089358664 +0000 UTC m=+0.040397810 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:28:25 compute-0 podman[223347]: 2025-12-03 00:28:25.151637563 +0000 UTC m=+0.088795646 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:28:25 compute-0 nova_compute[187243]: 2025-12-03 00:28:25.441 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:27 compute-0 nova_compute[187243]: 2025-12-03 00:28:27.319 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:29 compute-0 podman[197600]: time="2025-12-03T00:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:28:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:28:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2611 "" "Go-http-client/1.1"
Dec 03 00:28:30 compute-0 nova_compute[187243]: 2025-12-03 00:28:30.444 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: ERROR   00:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: ERROR   00:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: ERROR   00:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: ERROR   00:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: ERROR   00:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:28:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:28:32 compute-0 nova_compute[187243]: 2025-12-03 00:28:32.321 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:35 compute-0 nova_compute[187243]: 2025-12-03 00:28:35.446 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:37 compute-0 nova_compute[187243]: 2025-12-03 00:28:37.323 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:40 compute-0 nova_compute[187243]: 2025-12-03 00:28:40.446 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:41 compute-0 podman[223394]: 2025-12-03 00:28:41.095313696 +0000 UTC m=+0.051554185 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:28:41 compute-0 podman[223395]: 2025-12-03 00:28:41.098623158 +0000 UTC m=+0.051598026 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Dec 03 00:28:42 compute-0 nova_compute[187243]: 2025-12-03 00:28:42.325 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:43 compute-0 sshd-session[223434]: Received disconnect from 61.220.235.10 port 49788:11: Bye Bye [preauth]
Dec 03 00:28:43 compute-0 sshd-session[223434]: Disconnected from authenticating user root 61.220.235.10 port 49788 [preauth]
Dec 03 00:28:45 compute-0 nova_compute[187243]: 2025-12-03 00:28:45.448 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:47 compute-0 nova_compute[187243]: 2025-12-03 00:28:47.326 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:50 compute-0 nova_compute[187243]: 2025-12-03 00:28:50.451 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:52 compute-0 podman[223436]: 2025-12-03 00:28:52.094332065 +0000 UTC m=+0.053848952 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:28:52 compute-0 nova_compute[187243]: 2025-12-03 00:28:52.328 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:55 compute-0 nova_compute[187243]: 2025-12-03 00:28:55.109 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:55 compute-0 nova_compute[187243]: 2025-12-03 00:28:55.496 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:56 compute-0 podman[223460]: 2025-12-03 00:28:56.125404109 +0000 UTC m=+0.078320526 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Dec 03 00:28:56 compute-0 podman[223461]: 2025-12-03 00:28:56.155508853 +0000 UTC m=+0.106449542 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:28:56 compute-0 nova_compute[187243]: 2025-12-03 00:28:56.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:28:57 compute-0 nova_compute[187243]: 2025-12-03 00:28:57.330 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:28:59 compute-0 sshd-session[223506]: Invalid user tony from 23.95.37.90 port 47810
Dec 03 00:28:59 compute-0 sshd-session[223506]: Received disconnect from 23.95.37.90 port 47810:11: Bye Bye [preauth]
Dec 03 00:28:59 compute-0 sshd-session[223506]: Disconnected from invalid user tony 23.95.37.90 port 47810 [preauth]
Dec 03 00:28:59 compute-0 podman[197600]: time="2025-12-03T00:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:28:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:28:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec 03 00:29:00 compute-0 nova_compute[187243]: 2025-12-03 00:29:00.506 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:00 compute-0 nova_compute[187243]: 2025-12-03 00:29:00.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:00 compute-0 nova_compute[187243]: 2025-12-03 00:29:00.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:00 compute-0 nova_compute[187243]: 2025-12-03 00:29:00.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:29:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:29:00.733 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:29:00.733 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:29:00.734 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: ERROR   00:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: ERROR   00:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: ERROR   00:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: ERROR   00:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:29:01 compute-0 openstack_network_exporter[199746]: ERROR   00:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:02 compute-0 nova_compute[187243]: 2025-12-03 00:29:02.332 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:03 compute-0 nova_compute[187243]: 2025-12-03 00:29:03.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:04 compute-0 nova_compute[187243]: 2025-12-03 00:29:04.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.103 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.233 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.234 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.250 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.250 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=73.16160583496094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.251 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.251 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:29:05 compute-0 nova_compute[187243]: 2025-12-03 00:29:05.506 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:06 compute-0 nova_compute[187243]: 2025-12-03 00:29:06.496 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:29:06 compute-0 nova_compute[187243]: 2025-12-03 00:29:06.496 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:29:05 up  1:37,  0 user,  load average: 0.05, 0.18, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:29:06 compute-0 nova_compute[187243]: 2025-12-03 00:29:06.565 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:29:07 compute-0 nova_compute[187243]: 2025-12-03 00:29:07.333 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:07 compute-0 nova_compute[187243]: 2025-12-03 00:29:07.447 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:29:07 compute-0 nova_compute[187243]: 2025-12-03 00:29:07.963 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:29:07 compute-0 nova_compute[187243]: 2025-12-03 00:29:07.963 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.712s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:29:10 compute-0 nova_compute[187243]: 2025-12-03 00:29:10.563 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:10 compute-0 nova_compute[187243]: 2025-12-03 00:29:10.963 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:10 compute-0 nova_compute[187243]: 2025-12-03 00:29:10.963 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:12 compute-0 podman[223510]: 2025-12-03 00:29:12.089592749 +0000 UTC m=+0.050322704 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:29:12 compute-0 podman[223511]: 2025-12-03 00:29:12.08960576 +0000 UTC m=+0.047595288 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:29:12 compute-0 nova_compute[187243]: 2025-12-03 00:29:12.336 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:15 compute-0 nova_compute[187243]: 2025-12-03 00:29:15.566 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:17 compute-0 nova_compute[187243]: 2025-12-03 00:29:17.338 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:17 compute-0 nova_compute[187243]: 2025-12-03 00:29:17.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:20 compute-0 nova_compute[187243]: 2025-12-03 00:29:20.569 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:22 compute-0 podman[223551]: 2025-12-03 00:29:22.228747799 +0000 UTC m=+0.057939593 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:29:22 compute-0 nova_compute[187243]: 2025-12-03 00:29:22.340 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:25 compute-0 nova_compute[187243]: 2025-12-03 00:29:25.613 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-0 podman[223577]: 2025-12-03 00:29:27.100811938 +0000 UTC m=+0.054901518 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 03 00:29:27 compute-0 podman[223578]: 2025-12-03 00:29:27.165576778 +0000 UTC m=+0.104213936 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 03 00:29:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:29:27.331 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:29:27 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:29:27.331 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:29:27 compute-0 nova_compute[187243]: 2025-12-03 00:29:27.332 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:27 compute-0 nova_compute[187243]: 2025-12-03 00:29:27.341 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:29 compute-0 podman[197600]: time="2025-12-03T00:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:29:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:29:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2609 "" "Go-http-client/1.1"
Dec 03 00:29:30 compute-0 nova_compute[187243]: 2025-12-03 00:29:30.614 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: ERROR   00:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: ERROR   00:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: ERROR   00:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: ERROR   00:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: ERROR   00:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:29:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:29:32 compute-0 nova_compute[187243]: 2025-12-03 00:29:32.343 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:33 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:29:33.332 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:29:35 compute-0 nova_compute[187243]: 2025-12-03 00:29:35.663 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:37 compute-0 nova_compute[187243]: 2025-12-03 00:29:37.345 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:40 compute-0 nova_compute[187243]: 2025-12-03 00:29:40.693 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:42 compute-0 nova_compute[187243]: 2025-12-03 00:29:42.347 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:43 compute-0 podman[223625]: 2025-12-03 00:29:43.111543525 +0000 UTC m=+0.062561477 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:29:43 compute-0 podman[223624]: 2025-12-03 00:29:43.138583833 +0000 UTC m=+0.090044456 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:29:45 compute-0 sshd-session[223666]: Invalid user tony from 20.123.120.169 port 43364
Dec 03 00:29:45 compute-0 sshd-session[223666]: Received disconnect from 20.123.120.169 port 43364:11: Bye Bye [preauth]
Dec 03 00:29:45 compute-0 sshd-session[223666]: Disconnected from invalid user tony 20.123.120.169 port 43364 [preauth]
Dec 03 00:29:45 compute-0 nova_compute[187243]: 2025-12-03 00:29:45.748 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:47 compute-0 nova_compute[187243]: 2025-12-03 00:29:47.348 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:49 compute-0 sshd-session[223668]: Connection closed by authenticating user root 80.94.95.115 port 51228 [preauth]
Dec 03 00:29:50 compute-0 nova_compute[187243]: 2025-12-03 00:29:50.776 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:52 compute-0 nova_compute[187243]: 2025-12-03 00:29:52.350 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:53 compute-0 podman[223670]: 2025-12-03 00:29:53.101250882 +0000 UTC m=+0.050923640 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:29:55 compute-0 nova_compute[187243]: 2025-12-03 00:29:55.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:55 compute-0 nova_compute[187243]: 2025-12-03 00:29:55.778 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:57 compute-0 nova_compute[187243]: 2025-12-03 00:29:57.352 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:29:58 compute-0 podman[223697]: 2025-12-03 00:29:58.082158889 +0000 UTC m=+0.038601995 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:29:58 compute-0 podman[223698]: 2025-12-03 00:29:58.116515119 +0000 UTC m=+0.069185541 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:29:58 compute-0 nova_compute[187243]: 2025-12-03 00:29:58.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:29:59 compute-0 sshd-session[223695]: Connection closed by 101.47.140.127 port 42316 [preauth]
Dec 03 00:29:59 compute-0 podman[197600]: time="2025-12-03T00:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:29:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:29:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec 03 00:30:00 compute-0 nova_compute[187243]: 2025-12-03 00:30:00.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:00 compute-0 nova_compute[187243]: 2025-12-03 00:30:00.594 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:00 compute-0 nova_compute[187243]: 2025-12-03 00:30:00.594 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:30:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:30:00.740 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:30:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:30:00.740 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:30:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:30:00.740 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:30:00 compute-0 nova_compute[187243]: 2025-12-03 00:30:00.780 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: ERROR   00:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: ERROR   00:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: ERROR   00:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: ERROR   00:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: ERROR   00:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:30:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:30:02 compute-0 nova_compute[187243]: 2025-12-03 00:30:02.354 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:03 compute-0 nova_compute[187243]: 2025-12-03 00:30:03.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:05 compute-0 nova_compute[187243]: 2025-12-03 00:30:05.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:05 compute-0 nova_compute[187243]: 2025-12-03 00:30:05.781 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.261 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.261 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.261 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.261 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.376 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.377 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.391 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.392 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5844MB free_disk=73.16172409057617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.392 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:30:06 compute-0 nova_compute[187243]: 2025-12-03 00:30:06.392 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:30:07 compute-0 nova_compute[187243]: 2025-12-03 00:30:07.355 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:07 compute-0 nova_compute[187243]: 2025-12-03 00:30:07.589 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:30:07 compute-0 nova_compute[187243]: 2025-12-03 00:30:07.589 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:30:06 up  1:38,  0 user,  load average: 0.01, 0.14, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:30:07 compute-0 nova_compute[187243]: 2025-12-03 00:30:07.604 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:30:08 compute-0 nova_compute[187243]: 2025-12-03 00:30:08.266 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:30:08 compute-0 sshd-session[223746]: Invalid user syncthing from 61.220.235.10 port 48942
Dec 03 00:30:08 compute-0 sshd-session[223746]: Received disconnect from 61.220.235.10 port 48942:11: Bye Bye [preauth]
Dec 03 00:30:08 compute-0 sshd-session[223746]: Disconnected from invalid user syncthing 61.220.235.10 port 48942 [preauth]
Dec 03 00:30:08 compute-0 nova_compute[187243]: 2025-12-03 00:30:08.781 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:30:08 compute-0 nova_compute[187243]: 2025-12-03 00:30:08.781 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.389s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:30:10 compute-0 nova_compute[187243]: 2025-12-03 00:30:10.782 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:10 compute-0 nova_compute[187243]: 2025-12-03 00:30:10.782 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:10 compute-0 nova_compute[187243]: 2025-12-03 00:30:10.809 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:12 compute-0 nova_compute[187243]: 2025-12-03 00:30:12.358 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:14 compute-0 podman[223748]: 2025-12-03 00:30:14.089406203 +0000 UTC m=+0.049566286 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 03 00:30:14 compute-0 podman[223749]: 2025-12-03 00:30:14.105373027 +0000 UTC m=+0.054940189 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 03 00:30:15 compute-0 nova_compute[187243]: 2025-12-03 00:30:15.865 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:17 compute-0 nova_compute[187243]: 2025-12-03 00:30:17.359 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:20 compute-0 nova_compute[187243]: 2025-12-03 00:30:20.866 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:22 compute-0 nova_compute[187243]: 2025-12-03 00:30:22.361 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:24 compute-0 podman[223787]: 2025-12-03 00:30:24.087503595 +0000 UTC m=+0.045371353 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:30:25 compute-0 nova_compute[187243]: 2025-12-03 00:30:25.868 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:27 compute-0 nova_compute[187243]: 2025-12-03 00:30:27.362 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:29 compute-0 podman[223811]: 2025-12-03 00:30:29.086341296 +0000 UTC m=+0.047806072 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:30:29 compute-0 podman[223812]: 2025-12-03 00:30:29.160326024 +0000 UTC m=+0.118368516 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 03 00:30:29 compute-0 podman[197600]: time="2025-12-03T00:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:30:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:30:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec 03 00:30:30 compute-0 nova_compute[187243]: 2025-12-03 00:30:30.870 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: ERROR   00:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: ERROR   00:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: ERROR   00:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: ERROR   00:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: ERROR   00:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:30:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:30:32 compute-0 nova_compute[187243]: 2025-12-03 00:30:32.363 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:35 compute-0 nova_compute[187243]: 2025-12-03 00:30:35.909 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:37 compute-0 nova_compute[187243]: 2025-12-03 00:30:37.365 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:40 compute-0 nova_compute[187243]: 2025-12-03 00:30:40.981 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:42 compute-0 nova_compute[187243]: 2025-12-03 00:30:42.366 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:43 compute-0 sshd-session[223856]: Connection closed by 45.78.219.213 port 39340 [preauth]
Dec 03 00:30:45 compute-0 podman[223859]: 2025-12-03 00:30:45.102360404 +0000 UTC m=+0.054386585 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 03 00:30:45 compute-0 podman[223858]: 2025-12-03 00:30:45.119101628 +0000 UTC m=+0.066279459 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:30:45 compute-0 nova_compute[187243]: 2025-12-03 00:30:45.983 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:47 compute-0 sshd-session[223901]: Accepted publickey for zuul from 192.168.122.10 port 59868 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:30:47 compute-0 systemd-logind[795]: New session 28 of user zuul.
Dec 03 00:30:47 compute-0 systemd[1]: Started Session 28 of User zuul.
Dec 03 00:30:47 compute-0 sshd-session[223901]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:30:47 compute-0 sudo[223905]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 03 00:30:47 compute-0 sudo[223905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:30:47 compute-0 nova_compute[187243]: 2025-12-03 00:30:47.368 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:51 compute-0 nova_compute[187243]: 2025-12-03 00:30:51.016 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:52 compute-0 nova_compute[187243]: 2025-12-03 00:30:52.369 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:53 compute-0 ovs-vsctl[224116]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 03 00:30:54 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 223929 (sos)
Dec 03 00:30:54 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 03 00:30:54 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 03 00:30:54 compute-0 podman[224166]: 2025-12-03 00:30:54.365499134 +0000 UTC m=+0.092984219 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:30:54 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 03 00:30:54 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 03 00:30:54 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 00:30:55 compute-0 crontab[224554]: (root) LIST (root)
Dec 03 00:30:56 compute-0 nova_compute[187243]: 2025-12-03 00:30:56.069 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:56 compute-0 nova_compute[187243]: 2025-12-03 00:30:56.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:57 compute-0 nova_compute[187243]: 2025-12-03 00:30:57.401 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:30:57 compute-0 systemd[1]: Starting Hostname Service...
Dec 03 00:30:57 compute-0 systemd[1]: Started Hostname Service.
Dec 03 00:30:58 compute-0 nova_compute[187243]: 2025-12-03 00:30:58.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:30:59 compute-0 podman[224808]: 2025-12-03 00:30:59.504974882 +0000 UTC m=+0.081334152 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 03 00:30:59 compute-0 podman[224809]: 2025-12-03 00:30:59.517289136 +0000 UTC m=+0.093329868 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 00:30:59 compute-0 podman[197600]: time="2025-12-03T00:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:30:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:30:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:31:00 compute-0 nova_compute[187243]: 2025-12-03 00:31:00.593 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:31:00.741 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:31:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:31:00.741 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:31:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:31:00.741 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:31:01 compute-0 nova_compute[187243]: 2025-12-03 00:31:01.110 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: ERROR   00:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: ERROR   00:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: ERROR   00:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: ERROR   00:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: ERROR   00:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:31:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:31:02 compute-0 nova_compute[187243]: 2025-12-03 00:31:02.442 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:02 compute-0 nova_compute[187243]: 2025-12-03 00:31:02.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:02 compute-0 nova_compute[187243]: 2025-12-03 00:31:02.591 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:31:02 compute-0 sshd-session[224720]: Received disconnect from 45.78.219.95 port 50454:11: Bye Bye [preauth]
Dec 03 00:31:02 compute-0 sshd-session[224720]: Disconnected from authenticating user root 45.78.219.95 port 50454 [preauth]
Dec 03 00:31:03 compute-0 nova_compute[187243]: 2025-12-03 00:31:03.587 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:04 compute-0 ovs-appctl[225976]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 00:31:04 compute-0 ovs-appctl[225981]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 00:31:04 compute-0 ovs-appctl[225988]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 00:31:05 compute-0 nova_compute[187243]: 2025-12-03 00:31:05.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:06 compute-0 nova_compute[187243]: 2025-12-03 00:31:06.148 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.044 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.044 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.044 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.044 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.164 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.165 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.186 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.187 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=72.79410171508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.187 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.188 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:31:07 compute-0 nova_compute[187243]: 2025-12-03 00:31:07.444 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:08 compute-0 nova_compute[187243]: 2025-12-03 00:31:08.592 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:31:08 compute-0 nova_compute[187243]: 2025-12-03 00:31:08.592 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:31:07 up  1:39,  0 user,  load average: 0.61, 0.24, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:31:08 compute-0 nova_compute[187243]: 2025-12-03 00:31:08.617 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:31:09 compute-0 nova_compute[187243]: 2025-12-03 00:31:09.128 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:31:09 compute-0 nova_compute[187243]: 2025-12-03 00:31:09.638 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:31:09 compute-0 nova_compute[187243]: 2025-12-03 00:31:09.639 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.452s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:31:11 compute-0 nova_compute[187243]: 2025-12-03 00:31:11.148 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:11 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 00:31:12 compute-0 nova_compute[187243]: 2025-12-03 00:31:12.446 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:13 compute-0 systemd[1]: Starting Time & Date Service...
Dec 03 00:31:13 compute-0 systemd[1]: Started Time & Date Service.
Dec 03 00:31:13 compute-0 nova_compute[187243]: 2025-12-03 00:31:13.639 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:13 compute-0 nova_compute[187243]: 2025-12-03 00:31:13.641 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:16 compute-0 podman[227256]: 2025-12-03 00:31:16.113399583 +0000 UTC m=+0.060945458 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 03 00:31:16 compute-0 podman[227255]: 2025-12-03 00:31:16.113496015 +0000 UTC m=+0.061699636 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:31:16 compute-0 nova_compute[187243]: 2025-12-03 00:31:16.150 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:17 compute-0 nova_compute[187243]: 2025-12-03 00:31:17.449 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:19 compute-0 nova_compute[187243]: 2025-12-03 00:31:19.589 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:21 compute-0 nova_compute[187243]: 2025-12-03 00:31:21.151 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:22 compute-0 nova_compute[187243]: 2025-12-03 00:31:22.452 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:25 compute-0 podman[227295]: 2025-12-03 00:31:25.071288738 +0000 UTC m=+0.074100692 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:31:26 compute-0 nova_compute[187243]: 2025-12-03 00:31:26.153 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:27 compute-0 nova_compute[187243]: 2025-12-03 00:31:27.456 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:29 compute-0 podman[227321]: 2025-12-03 00:31:29.64931795 +0000 UTC m=+0.046505911 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 03 00:31:29 compute-0 podman[227322]: 2025-12-03 00:31:29.685367841 +0000 UTC m=+0.079742682 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Dec 03 00:31:29 compute-0 podman[197600]: time="2025-12-03T00:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:31:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:31:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec 03 00:31:31 compute-0 nova_compute[187243]: 2025-12-03 00:31:31.155 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: ERROR   00:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: ERROR   00:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: ERROR   00:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: ERROR   00:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: ERROR   00:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:31:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:31:32 compute-0 nova_compute[187243]: 2025-12-03 00:31:32.458 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:34 compute-0 sudo[223905]: pam_unix(sudo:session): session closed for user root
Dec 03 00:31:34 compute-0 sshd-session[223904]: Received disconnect from 192.168.122.10 port 59868:11: disconnected by user
Dec 03 00:31:34 compute-0 sshd-session[223904]: Disconnected from user zuul 192.168.122.10 port 59868
Dec 03 00:31:34 compute-0 sshd-session[223901]: pam_unix(sshd:session): session closed for user zuul
Dec 03 00:31:34 compute-0 systemd-logind[795]: Session 28 logged out. Waiting for processes to exit.
Dec 03 00:31:34 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Dec 03 00:31:34 compute-0 systemd[1]: session-28.scope: Consumed 1min 15.033s CPU time, 519.5M memory peak, read 102.5M from disk, written 171.7M to disk.
Dec 03 00:31:34 compute-0 systemd-logind[795]: Removed session 28.
Dec 03 00:31:34 compute-0 sshd-session[227364]: Accepted publickey for zuul from 192.168.122.10 port 42694 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:31:34 compute-0 systemd-logind[795]: New session 29 of user zuul.
Dec 03 00:31:34 compute-0 systemd[1]: Started Session 29 of User zuul.
Dec 03 00:31:34 compute-0 sshd-session[227364]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:31:34 compute-0 sudo[227368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-12-03-hkjfncf.tar.xz
Dec 03 00:31:34 compute-0 sudo[227368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:31:35 compute-0 sudo[227368]: pam_unix(sudo:session): session closed for user root
Dec 03 00:31:35 compute-0 sshd-session[227367]: Received disconnect from 192.168.122.10 port 42694:11: disconnected by user
Dec 03 00:31:35 compute-0 sshd-session[227367]: Disconnected from user zuul 192.168.122.10 port 42694
Dec 03 00:31:35 compute-0 sshd-session[227364]: pam_unix(sshd:session): session closed for user zuul
Dec 03 00:31:35 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Dec 03 00:31:35 compute-0 systemd-logind[795]: Session 29 logged out. Waiting for processes to exit.
Dec 03 00:31:35 compute-0 systemd-logind[795]: Removed session 29.
Dec 03 00:31:35 compute-0 sshd-session[227393]: Accepted publickey for zuul from 192.168.122.10 port 42700 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:31:35 compute-0 systemd-logind[795]: New session 30 of user zuul.
Dec 03 00:31:35 compute-0 systemd[1]: Started Session 30 of User zuul.
Dec 03 00:31:35 compute-0 sshd-session[227393]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:31:35 compute-0 sudo[227397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 03 00:31:35 compute-0 sudo[227397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:31:35 compute-0 sudo[227397]: pam_unix(sudo:session): session closed for user root
Dec 03 00:31:35 compute-0 sshd-session[227396]: Received disconnect from 192.168.122.10 port 42700:11: disconnected by user
Dec 03 00:31:35 compute-0 sshd-session[227396]: Disconnected from user zuul 192.168.122.10 port 42700
Dec 03 00:31:35 compute-0 sshd-session[227393]: pam_unix(sshd:session): session closed for user zuul
Dec 03 00:31:35 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Dec 03 00:31:35 compute-0 systemd-logind[795]: Session 30 logged out. Waiting for processes to exit.
Dec 03 00:31:35 compute-0 systemd-logind[795]: Removed session 30.
Dec 03 00:31:36 compute-0 nova_compute[187243]: 2025-12-03 00:31:36.156 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:37 compute-0 nova_compute[187243]: 2025-12-03 00:31:37.460 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:39 compute-0 sshd-session[227422]: banner exchange: Connection from 111.61.105.100 port 58674: invalid format
Dec 03 00:31:41 compute-0 nova_compute[187243]: 2025-12-03 00:31:41.159 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:42 compute-0 nova_compute[187243]: 2025-12-03 00:31:42.462 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:43 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 03 00:31:43 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 00:31:43 compute-0 sshd-session[227423]: Invalid user blue from 61.220.235.10 port 48102
Dec 03 00:31:43 compute-0 sshd-session[227423]: Received disconnect from 61.220.235.10 port 48102:11: Bye Bye [preauth]
Dec 03 00:31:43 compute-0 sshd-session[227423]: Disconnected from invalid user blue 61.220.235.10 port 48102 [preauth]
Dec 03 00:31:46 compute-0 nova_compute[187243]: 2025-12-03 00:31:46.161 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:47 compute-0 podman[227431]: 2025-12-03 00:31:47.096542031 +0000 UTC m=+0.053583565 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 03 00:31:47 compute-0 podman[227432]: 2025-12-03 00:31:47.098724545 +0000 UTC m=+0.055827061 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=)
Dec 03 00:31:47 compute-0 nova_compute[187243]: 2025-12-03 00:31:47.508 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:51 compute-0 nova_compute[187243]: 2025-12-03 00:31:51.162 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:52 compute-0 nova_compute[187243]: 2025-12-03 00:31:52.548 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:56 compute-0 podman[227471]: 2025-12-03 00:31:56.121510863 +0000 UTC m=+0.080229773 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:31:56 compute-0 nova_compute[187243]: 2025-12-03 00:31:56.164 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:56 compute-0 nova_compute[187243]: 2025-12-03 00:31:56.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:57 compute-0 nova_compute[187243]: 2025-12-03 00:31:57.550 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:31:58 compute-0 nova_compute[187243]: 2025-12-03 00:31:58.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:31:59 compute-0 podman[197600]: time="2025-12-03T00:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:31:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:31:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec 03 00:32:00 compute-0 podman[227497]: 2025-12-03 00:32:00.099319741 +0000 UTC m=+0.058925197 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 03 00:32:00 compute-0 podman[227498]: 2025-12-03 00:32:00.132180523 +0000 UTC m=+0.091415470 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 03 00:32:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:32:00.742 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:32:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:32:00.742 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:32:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:32:00.742 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:32:01 compute-0 nova_compute[187243]: 2025-12-03 00:32:01.164 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: ERROR   00:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: ERROR   00:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: ERROR   00:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: ERROR   00:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: ERROR   00:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:32:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:32:01 compute-0 nova_compute[187243]: 2025-12-03 00:32:01.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:02 compute-0 nova_compute[187243]: 2025-12-03 00:32:02.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:02 compute-0 nova_compute[187243]: 2025-12-03 00:32:02.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:32:02 compute-0 nova_compute[187243]: 2025-12-03 00:32:02.610 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:05 compute-0 nova_compute[187243]: 2025-12-03 00:32:05.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:06 compute-0 nova_compute[187243]: 2025-12-03 00:32:06.167 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:06 compute-0 nova_compute[187243]: 2025-12-03 00:32:06.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.110 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.110 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.111 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.111 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.240 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.241 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.258 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.258 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5777MB free_disk=73.1615104675293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.259 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.259 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:32:07 compute-0 nova_compute[187243]: 2025-12-03 00:32:07.613 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:08 compute-0 nova_compute[187243]: 2025-12-03 00:32:08.351 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:32:08 compute-0 nova_compute[187243]: 2025-12-03 00:32:08.351 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:32:07 up  1:40,  0 user,  load average: 0.44, 0.28, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:32:08 compute-0 nova_compute[187243]: 2025-12-03 00:32:08.471 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:32:09 compute-0 nova_compute[187243]: 2025-12-03 00:32:09.069 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:32:09 compute-0 nova_compute[187243]: 2025-12-03 00:32:09.579 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:32:09 compute-0 nova_compute[187243]: 2025-12-03 00:32:09.579 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.320s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:32:11 compute-0 nova_compute[187243]: 2025-12-03 00:32:11.168 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:12 compute-0 nova_compute[187243]: 2025-12-03 00:32:12.579 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:12 compute-0 nova_compute[187243]: 2025-12-03 00:32:12.580 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:12 compute-0 nova_compute[187243]: 2025-12-03 00:32:12.676 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:16 compute-0 nova_compute[187243]: 2025-12-03 00:32:16.170 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:17 compute-0 nova_compute[187243]: 2025-12-03 00:32:17.729 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:18 compute-0 podman[227548]: 2025-12-03 00:32:18.097004137 +0000 UTC m=+0.049011982 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:32:18 compute-0 podman[227547]: 2025-12-03 00:32:18.097149601 +0000 UTC m=+0.051914324 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 03 00:32:21 compute-0 nova_compute[187243]: 2025-12-03 00:32:21.173 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:22 compute-0 sshd-session[227589]: Received disconnect from 101.47.140.127 port 36926:11: Bye Bye [preauth]
Dec 03 00:32:22 compute-0 sshd-session[227589]: Disconnected from authenticating user root 101.47.140.127 port 36926 [preauth]
Dec 03 00:32:22 compute-0 nova_compute[187243]: 2025-12-03 00:32:22.732 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:26 compute-0 nova_compute[187243]: 2025-12-03 00:32:26.174 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:26 compute-0 podman[227591]: 2025-12-03 00:32:26.255295083 +0000 UTC m=+0.049361741 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:32:27 compute-0 nova_compute[187243]: 2025-12-03 00:32:27.735 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:29 compute-0 podman[197600]: time="2025-12-03T00:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:32:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:32:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec 03 00:32:31 compute-0 podman[227615]: 2025-12-03 00:32:31.092525969 +0000 UTC m=+0.053830950 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:32:31 compute-0 podman[227616]: 2025-12-03 00:32:31.120546691 +0000 UTC m=+0.076531772 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Dec 03 00:32:31 compute-0 nova_compute[187243]: 2025-12-03 00:32:31.176 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: ERROR   00:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: ERROR   00:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: ERROR   00:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: ERROR   00:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:32:31 compute-0 openstack_network_exporter[199746]: ERROR   00:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:32:32 compute-0 nova_compute[187243]: 2025-12-03 00:32:32.739 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:36 compute-0 nova_compute[187243]: 2025-12-03 00:32:36.179 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:37 compute-0 nova_compute[187243]: 2025-12-03 00:32:37.787 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:41 compute-0 nova_compute[187243]: 2025-12-03 00:32:41.180 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:42 compute-0 nova_compute[187243]: 2025-12-03 00:32:42.790 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:46 compute-0 nova_compute[187243]: 2025-12-03 00:32:46.182 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:46 compute-0 sshd-session[227661]: Invalid user mika from 45.78.222.160 port 44798
Dec 03 00:32:46 compute-0 sshd-session[227661]: Received disconnect from 45.78.222.160 port 44798:11: Bye Bye [preauth]
Dec 03 00:32:46 compute-0 sshd-session[227661]: Disconnected from invalid user mika 45.78.222.160 port 44798 [preauth]
Dec 03 00:32:47 compute-0 nova_compute[187243]: 2025-12-03 00:32:47.836 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:49 compute-0 podman[227665]: 2025-12-03 00:32:49.09699615 +0000 UTC m=+0.059045331 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:32:49 compute-0 podman[227666]: 2025-12-03 00:32:49.125430193 +0000 UTC m=+0.079281931 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Dec 03 00:32:49 compute-0 sshd-session[227663]: Invalid user NL5xUDpV2xRa from 111.61.105.100 port 39412
Dec 03 00:32:49 compute-0 sshd-session[227663]: fatal: userauth_pubkey: parse publickey packet: incomplete message [preauth]
Dec 03 00:32:51 compute-0 nova_compute[187243]: 2025-12-03 00:32:51.215 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:52 compute-0 nova_compute[187243]: 2025-12-03 00:32:52.839 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:56 compute-0 nova_compute[187243]: 2025-12-03 00:32:56.216 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:56 compute-0 nova_compute[187243]: 2025-12-03 00:32:56.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:57 compute-0 podman[227706]: 2025-12-03 00:32:57.120502132 +0000 UTC m=+0.082906280 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:32:57 compute-0 nova_compute[187243]: 2025-12-03 00:32:57.842 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:32:59 compute-0 nova_compute[187243]: 2025-12-03 00:32:59.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:32:59 compute-0 podman[197600]: time="2025-12-03T00:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:32:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:32:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2607 "" "Go-http-client/1.1"
Dec 03 00:33:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:33:00.743 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:33:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:33:00.744 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:33:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:33:00.744 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:33:01 compute-0 nova_compute[187243]: 2025-12-03 00:33:01.265 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: ERROR   00:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: ERROR   00:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: ERROR   00:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: ERROR   00:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: ERROR   00:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:33:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:33:01 compute-0 nova_compute[187243]: 2025-12-03 00:33:01.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:02 compute-0 podman[227732]: 2025-12-03 00:33:02.081322924 +0000 UTC m=+0.045046894 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 03 00:33:02 compute-0 podman[227733]: 2025-12-03 00:33:02.114865653 +0000 UTC m=+0.074217805 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 00:33:02 compute-0 nova_compute[187243]: 2025-12-03 00:33:02.844 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:03 compute-0 nova_compute[187243]: 2025-12-03 00:33:03.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:03 compute-0 nova_compute[187243]: 2025-12-03 00:33:03.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:33:05 compute-0 nova_compute[187243]: 2025-12-03 00:33:05.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:06 compute-0 nova_compute[187243]: 2025-12-03 00:33:06.304 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:06 compute-0 nova_compute[187243]: 2025-12-03 00:33:06.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.108 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.109 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.109 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.221 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.222 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.237 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.237 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5813MB free_disk=73.16093444824219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.238 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.238 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:33:07 compute-0 nova_compute[187243]: 2025-12-03 00:33:07.847 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.300 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.300 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:33:07 up  1:41,  0 user,  load average: 0.16, 0.23, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.323 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.347 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.348 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.362 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.426 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.454 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:33:08 compute-0 nova_compute[187243]: 2025-12-03 00:33:08.963 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:33:09 compute-0 nova_compute[187243]: 2025-12-03 00:33:09.471 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:33:09 compute-0 nova_compute[187243]: 2025-12-03 00:33:09.471 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.233s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:33:09 compute-0 nova_compute[187243]: 2025-12-03 00:33:09.472 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:09 compute-0 nova_compute[187243]: 2025-12-03 00:33:09.472 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:33:11 compute-0 nova_compute[187243]: 2025-12-03 00:33:11.306 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:12 compute-0 nova_compute[187243]: 2025-12-03 00:33:12.850 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:12 compute-0 nova_compute[187243]: 2025-12-03 00:33:12.977 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:13 compute-0 nova_compute[187243]: 2025-12-03 00:33:13.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:15 compute-0 nova_compute[187243]: 2025-12-03 00:33:15.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:15 compute-0 nova_compute[187243]: 2025-12-03 00:33:15.593 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:33:16 compute-0 nova_compute[187243]: 2025-12-03 00:33:16.099 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:33:16 compute-0 nova_compute[187243]: 2025-12-03 00:33:16.361 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:16 compute-0 nova_compute[187243]: 2025-12-03 00:33:16.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:16 compute-0 sshd-session[227782]: Invalid user dangulo from 61.220.235.10 port 47260
Dec 03 00:33:16 compute-0 sshd-session[227782]: Received disconnect from 61.220.235.10 port 47260:11: Bye Bye [preauth]
Dec 03 00:33:16 compute-0 sshd-session[227782]: Disconnected from invalid user dangulo 61.220.235.10 port 47260 [preauth]
Dec 03 00:33:17 compute-0 sshd-session[227779]: Received disconnect from 45.78.219.213 port 36116:11: Bye Bye [preauth]
Dec 03 00:33:17 compute-0 sshd-session[227779]: Disconnected from authenticating user root 45.78.219.213 port 36116 [preauth]
Dec 03 00:33:17 compute-0 nova_compute[187243]: 2025-12-03 00:33:17.887 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:20 compute-0 nova_compute[187243]: 2025-12-03 00:33:20.093 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:20 compute-0 podman[227784]: 2025-12-03 00:33:20.096179614 +0000 UTC m=+0.057684946 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:33:20 compute-0 podman[227785]: 2025-12-03 00:33:20.101511606 +0000 UTC m=+0.059664355 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, version=9.6, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:33:21 compute-0 nova_compute[187243]: 2025-12-03 00:33:21.363 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:22 compute-0 nova_compute[187243]: 2025-12-03 00:33:22.890 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:26 compute-0 nova_compute[187243]: 2025-12-03 00:33:26.380 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:27 compute-0 nova_compute[187243]: 2025-12-03 00:33:27.893 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:28 compute-0 podman[227824]: 2025-12-03 00:33:28.118337204 +0000 UTC m=+0.073623151 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:33:29 compute-0 podman[197600]: time="2025-12-03T00:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:33:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:33:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec 03 00:33:31 compute-0 nova_compute[187243]: 2025-12-03 00:33:31.381 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: ERROR   00:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: ERROR   00:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: ERROR   00:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: ERROR   00:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: ERROR   00:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:33:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:33:32 compute-0 nova_compute[187243]: 2025-12-03 00:33:32.950 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:33 compute-0 podman[227848]: 2025-12-03 00:33:33.084233899 +0000 UTC m=+0.041900946 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:33:33 compute-0 podman[227849]: 2025-12-03 00:33:33.140702755 +0000 UTC m=+0.095834149 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 03 00:33:36 compute-0 nova_compute[187243]: 2025-12-03 00:33:36.382 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:38 compute-0 nova_compute[187243]: 2025-12-03 00:33:38.001 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:41 compute-0 nova_compute[187243]: 2025-12-03 00:33:41.385 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:43 compute-0 nova_compute[187243]: 2025-12-03 00:33:43.004 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:46 compute-0 sshd[128750]: Timeout before authentication for connection from 111.61.105.100 to 38.102.83.77, pid = 227429
Dec 03 00:33:46 compute-0 nova_compute[187243]: 2025-12-03 00:33:46.387 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:47 compute-0 sshd-session[227890]: Invalid user username from 45.78.219.95 port 57292
Dec 03 00:33:48 compute-0 nova_compute[187243]: 2025-12-03 00:33:48.041 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:49 compute-0 sshd-session[227890]: Received disconnect from 45.78.219.95 port 57292:11: Bye Bye [preauth]
Dec 03 00:33:49 compute-0 sshd-session[227890]: Disconnected from invalid user username 45.78.219.95 port 57292 [preauth]
Dec 03 00:33:51 compute-0 podman[227893]: 2025-12-03 00:33:51.106459843 +0000 UTC m=+0.057221678 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:33:51 compute-0 podman[227892]: 2025-12-03 00:33:51.136492927 +0000 UTC m=+0.088258087 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd)
Dec 03 00:33:51 compute-0 nova_compute[187243]: 2025-12-03 00:33:51.416 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:53 compute-0 nova_compute[187243]: 2025-12-03 00:33:53.044 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:56 compute-0 nova_compute[187243]: 2025-12-03 00:33:56.418 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:58 compute-0 nova_compute[187243]: 2025-12-03 00:33:58.047 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:33:58 compute-0 nova_compute[187243]: 2025-12-03 00:33:58.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:33:59 compute-0 podman[227934]: 2025-12-03 00:33:59.086361787 +0000 UTC m=+0.044793590 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:33:59 compute-0 podman[197600]: time="2025-12-03T00:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:33:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:33:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec 03 00:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:34:00.744 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:34:00.745 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:34:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:34:00.745 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: ERROR   00:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: ERROR   00:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: ERROR   00:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: ERROR   00:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: ERROR   00:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:34:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:34:01 compute-0 nova_compute[187243]: 2025-12-03 00:34:01.421 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:01 compute-0 nova_compute[187243]: 2025-12-03 00:34:01.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:01 compute-0 nova_compute[187243]: 2025-12-03 00:34:01.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:03 compute-0 nova_compute[187243]: 2025-12-03 00:34:03.050 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:04 compute-0 podman[227959]: 2025-12-03 00:34:04.096643032 +0000 UTC m=+0.055423264 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:34:04 compute-0 podman[227960]: 2025-12-03 00:34:04.116026041 +0000 UTC m=+0.074254660 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:34:04 compute-0 nova_compute[187243]: 2025-12-03 00:34:04.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:04 compute-0 nova_compute[187243]: 2025-12-03 00:34:04.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:34:06 compute-0 nova_compute[187243]: 2025-12-03 00:34:06.421 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:07 compute-0 nova_compute[187243]: 2025-12-03 00:34:07.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:08 compute-0 nova_compute[187243]: 2025-12-03 00:34:08.053 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:08 compute-0 nova_compute[187243]: 2025-12-03 00:34:08.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.101 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.102 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.102 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.227 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.228 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.243 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.244 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5802MB free_disk=73.16093444824219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.244 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:34:09 compute-0 nova_compute[187243]: 2025-12-03 00:34:09.244 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:34:10 compute-0 nova_compute[187243]: 2025-12-03 00:34:10.285 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:34:10 compute-0 nova_compute[187243]: 2025-12-03 00:34:10.286 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:34:09 up  1:42,  0 user,  load average: 0.13, 0.20, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:34:10 compute-0 nova_compute[187243]: 2025-12-03 00:34:10.358 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:34:10 compute-0 nova_compute[187243]: 2025-12-03 00:34:10.868 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:34:11 compute-0 nova_compute[187243]: 2025-12-03 00:34:11.381 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:34:11 compute-0 nova_compute[187243]: 2025-12-03 00:34:11.381 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:34:11 compute-0 nova_compute[187243]: 2025-12-03 00:34:11.475 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:13 compute-0 nova_compute[187243]: 2025-12-03 00:34:13.057 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:14 compute-0 nova_compute[187243]: 2025-12-03 00:34:14.381 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:14 compute-0 nova_compute[187243]: 2025-12-03 00:34:14.382 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:14 compute-0 nova_compute[187243]: 2025-12-03 00:34:14.382 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:16 compute-0 nova_compute[187243]: 2025-12-03 00:34:16.475 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:18 compute-0 nova_compute[187243]: 2025-12-03 00:34:18.060 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:19 compute-0 sshd[128750]: Timeout before authentication for connection from 111.61.105.100 to 38.102.83.77, pid = 227545
Dec 03 00:34:21 compute-0 nova_compute[187243]: 2025-12-03 00:34:21.477 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:22 compute-0 podman[228003]: 2025-12-03 00:34:22.087355303 +0000 UTC m=+0.046632126 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 03 00:34:22 compute-0 podman[228004]: 2025-12-03 00:34:22.123303754 +0000 UTC m=+0.079247275 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:34:23 compute-0 nova_compute[187243]: 2025-12-03 00:34:23.063 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:26 compute-0 nova_compute[187243]: 2025-12-03 00:34:26.480 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:28 compute-0 nova_compute[187243]: 2025-12-03 00:34:28.066 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:29 compute-0 podman[197600]: time="2025-12-03T00:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:34:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:34:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec 03 00:34:30 compute-0 podman[228043]: 2025-12-03 00:34:30.086323269 +0000 UTC m=+0.044481543 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: ERROR   00:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: ERROR   00:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: ERROR   00:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: ERROR   00:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: ERROR   00:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:34:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:34:31 compute-0 nova_compute[187243]: 2025-12-03 00:34:31.479 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:33 compute-0 nova_compute[187243]: 2025-12-03 00:34:33.068 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:35 compute-0 podman[228067]: 2025-12-03 00:34:35.093511426 +0000 UTC m=+0.049934658 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 03 00:34:35 compute-0 podman[228068]: 2025-12-03 00:34:35.172314398 +0000 UTC m=+0.125201942 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:34:36 compute-0 nova_compute[187243]: 2025-12-03 00:34:36.482 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:38 compute-0 nova_compute[187243]: 2025-12-03 00:34:38.071 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:41 compute-0 nova_compute[187243]: 2025-12-03 00:34:41.482 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:43 compute-0 nova_compute[187243]: 2025-12-03 00:34:43.132 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:46 compute-0 nova_compute[187243]: 2025-12-03 00:34:46.518 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:48 compute-0 nova_compute[187243]: 2025-12-03 00:34:48.134 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:51 compute-0 nova_compute[187243]: 2025-12-03 00:34:51.533 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:53 compute-0 podman[228113]: 2025-12-03 00:34:53.110357234 +0000 UTC m=+0.064989341 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 03 00:34:53 compute-0 podman[228114]: 2025-12-03 00:34:53.135516437 +0000 UTC m=+0.080560076 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 03 00:34:53 compute-0 nova_compute[187243]: 2025-12-03 00:34:53.137 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:56 compute-0 nova_compute[187243]: 2025-12-03 00:34:56.534 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:58 compute-0 nova_compute[187243]: 2025-12-03 00:34:58.139 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:34:59 compute-0 nova_compute[187243]: 2025-12-03 00:34:59.103 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:34:59 compute-0 podman[197600]: time="2025-12-03T00:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:34:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:34:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec 03 00:35:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:35:00.746 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:35:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:35:00.746 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:35:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:35:00.746 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:35:01 compute-0 podman[228153]: 2025-12-03 00:35:01.091791516 +0000 UTC m=+0.052435310 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: ERROR   00:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: ERROR   00:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: ERROR   00:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: ERROR   00:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: ERROR   00:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:35:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:35:01 compute-0 nova_compute[187243]: 2025-12-03 00:35:01.535 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:02 compute-0 nova_compute[187243]: 2025-12-03 00:35:02.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:02 compute-0 nova_compute[187243]: 2025-12-03 00:35:02.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:03 compute-0 nova_compute[187243]: 2025-12-03 00:35:03.142 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:04 compute-0 nova_compute[187243]: 2025-12-03 00:35:04.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:04 compute-0 nova_compute[187243]: 2025-12-03 00:35:04.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:35:04 compute-0 sshd-session[228112]: Connection closed by 101.47.140.127 port 40620 [preauth]
Dec 03 00:35:05 compute-0 sshd-session[228178]: Invalid user bodega from 45.78.222.160 port 48624
Dec 03 00:35:05 compute-0 podman[228181]: 2025-12-03 00:35:05.542923787 +0000 UTC m=+0.074699332 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 03 00:35:05 compute-0 podman[228182]: 2025-12-03 00:35:05.548752481 +0000 UTC m=+0.077810899 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Dec 03 00:35:05 compute-0 sshd-session[228178]: Received disconnect from 45.78.222.160 port 48624:11: Bye Bye [preauth]
Dec 03 00:35:05 compute-0 sshd-session[228178]: Disconnected from invalid user bodega 45.78.222.160 port 48624 [preauth]
Dec 03 00:35:06 compute-0 nova_compute[187243]: 2025-12-03 00:35:06.537 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:07 compute-0 nova_compute[187243]: 2025-12-03 00:35:07.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:07 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:35:07.741 104379 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '7a:9d:37', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'b2:5b:26:46:39:9e'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 03 00:35:07 compute-0 nova_compute[187243]: 2025-12-03 00:35:07.741 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:07 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:35:07.742 104379 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 03 00:35:08 compute-0 nova_compute[187243]: 2025-12-03 00:35:08.144 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:09 compute-0 nova_compute[187243]: 2025-12-03 00:35:09.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.106 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.106 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.252 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.253 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.273 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.273 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5803MB free_disk=73.16093444824219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.274 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:35:10 compute-0 nova_compute[187243]: 2025-12-03 00:35:10.274 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:35:10 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:35:10.743 104379 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=83290d9e-bd8f-4c21-b54d-356f7c3da39f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 00:35:11 compute-0 nova_compute[187243]: 2025-12-03 00:35:11.361 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:35:11 compute-0 nova_compute[187243]: 2025-12-03 00:35:11.362 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:35:10 up  1:43,  0 user,  load average: 0.05, 0.16, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:35:11 compute-0 nova_compute[187243]: 2025-12-03 00:35:11.380 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:35:11 compute-0 nova_compute[187243]: 2025-12-03 00:35:11.538 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:11 compute-0 nova_compute[187243]: 2025-12-03 00:35:11.889 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:35:12 compute-0 nova_compute[187243]: 2025-12-03 00:35:12.400 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:35:12 compute-0 nova_compute[187243]: 2025-12-03 00:35:12.400 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:35:13 compute-0 nova_compute[187243]: 2025-12-03 00:35:13.147 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:14 compute-0 nova_compute[187243]: 2025-12-03 00:35:14.400 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:14 compute-0 nova_compute[187243]: 2025-12-03 00:35:14.401 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:16 compute-0 nova_compute[187243]: 2025-12-03 00:35:16.540 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:18 compute-0 nova_compute[187243]: 2025-12-03 00:35:18.150 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:20 compute-0 nova_compute[187243]: 2025-12-03 00:35:20.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:21 compute-0 nova_compute[187243]: 2025-12-03 00:35:21.541 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:23 compute-0 nova_compute[187243]: 2025-12-03 00:35:23.153 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:24 compute-0 podman[228228]: 2025-12-03 00:35:24.087401658 +0000 UTC m=+0.049484657 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:35:24 compute-0 podman[228229]: 2025-12-03 00:35:24.098696087 +0000 UTC m=+0.057283130 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Dec 03 00:35:26 compute-0 nova_compute[187243]: 2025-12-03 00:35:26.542 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:28 compute-0 nova_compute[187243]: 2025-12-03 00:35:28.156 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:29 compute-0 podman[197600]: time="2025-12-03T00:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:35:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:35:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2608 "" "Go-http-client/1.1"
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: ERROR   00:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: ERROR   00:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: ERROR   00:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: ERROR   00:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: ERROR   00:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:35:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:35:31 compute-0 nova_compute[187243]: 2025-12-03 00:35:31.544 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:32 compute-0 podman[228268]: 2025-12-03 00:35:32.097643633 +0000 UTC m=+0.053272411 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:35:33 compute-0 nova_compute[187243]: 2025-12-03 00:35:33.193 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:36 compute-0 podman[228293]: 2025-12-03 00:35:36.114471246 +0000 UTC m=+0.065304309 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 03 00:35:36 compute-0 podman[228294]: 2025-12-03 00:35:36.121035239 +0000 UTC m=+0.076805044 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 03 00:35:36 compute-0 nova_compute[187243]: 2025-12-03 00:35:36.594 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:38 compute-0 nova_compute[187243]: 2025-12-03 00:35:38.197 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:41 compute-0 nova_compute[187243]: 2025-12-03 00:35:41.596 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:43 compute-0 nova_compute[187243]: 2025-12-03 00:35:43.199 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:46 compute-0 sshd-session[228342]: Received disconnect from 45.78.219.213 port 39900:11: Bye Bye [preauth]
Dec 03 00:35:46 compute-0 sshd-session[228342]: Disconnected from authenticating user root 45.78.219.213 port 39900 [preauth]
Dec 03 00:35:46 compute-0 nova_compute[187243]: 2025-12-03 00:35:46.597 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:48 compute-0 nova_compute[187243]: 2025-12-03 00:35:48.202 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:51 compute-0 nova_compute[187243]: 2025-12-03 00:35:51.647 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:53 compute-0 nova_compute[187243]: 2025-12-03 00:35:53.205 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:55 compute-0 podman[228345]: 2025-12-03 00:35:55.097121479 +0000 UTC m=+0.051157217 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:35:55 compute-0 podman[228344]: 2025-12-03 00:35:55.097241632 +0000 UTC m=+0.055263839 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:35:56 compute-0 nova_compute[187243]: 2025-12-03 00:35:56.649 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:58 compute-0 nova_compute[187243]: 2025-12-03 00:35:58.208 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:35:59 compute-0 nova_compute[187243]: 2025-12-03 00:35:59.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:35:59 compute-0 podman[197600]: time="2025-12-03T00:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:35:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:35:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec 03 00:36:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:36:00.747 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:36:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:36:00.747 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:36:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:36:00.747 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: ERROR   00:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: ERROR   00:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: ERROR   00:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: ERROR   00:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: ERROR   00:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:36:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:36:01 compute-0 nova_compute[187243]: 2025-12-03 00:36:01.650 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:03 compute-0 podman[228385]: 2025-12-03 00:36:03.093917943 +0000 UTC m=+0.056480601 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:36:03 compute-0 nova_compute[187243]: 2025-12-03 00:36:03.210 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:03 compute-0 nova_compute[187243]: 2025-12-03 00:36:03.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:04 compute-0 nova_compute[187243]: 2025-12-03 00:36:04.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:05 compute-0 nova_compute[187243]: 2025-12-03 00:36:05.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:05 compute-0 nova_compute[187243]: 2025-12-03 00:36:05.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:36:06 compute-0 nova_compute[187243]: 2025-12-03 00:36:06.652 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:07 compute-0 podman[228409]: 2025-12-03 00:36:07.100259516 +0000 UTC m=+0.058044050 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:36:07 compute-0 podman[228410]: 2025-12-03 00:36:07.127525251 +0000 UTC m=+0.081981492 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:36:07 compute-0 nova_compute[187243]: 2025-12-03 00:36:07.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:08 compute-0 nova_compute[187243]: 2025-12-03 00:36:08.212 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:10 compute-0 nova_compute[187243]: 2025-12-03 00:36:10.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.115 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.115 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.116 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.116 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.256 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.257 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.274 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.275 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5801MB free_disk=73.16093444824219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.275 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.275 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:36:11 compute-0 nova_compute[187243]: 2025-12-03 00:36:11.654 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:12 compute-0 nova_compute[187243]: 2025-12-03 00:36:12.319 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:36:12 compute-0 nova_compute[187243]: 2025-12-03 00:36:12.319 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:36:11 up  1:44,  0 user,  load average: 0.01, 0.13, 0.19\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:36:12 compute-0 nova_compute[187243]: 2025-12-03 00:36:12.342 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:36:12 compute-0 nova_compute[187243]: 2025-12-03 00:36:12.848 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:36:13 compute-0 nova_compute[187243]: 2025-12-03 00:36:13.215 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:13 compute-0 nova_compute[187243]: 2025-12-03 00:36:13.365 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:36:13 compute-0 nova_compute[187243]: 2025-12-03 00:36:13.365 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:36:14 compute-0 nova_compute[187243]: 2025-12-03 00:36:14.365 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:15 compute-0 nova_compute[187243]: 2025-12-03 00:36:15.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:16 compute-0 nova_compute[187243]: 2025-12-03 00:36:16.656 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:18 compute-0 nova_compute[187243]: 2025-12-03 00:36:18.217 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:21 compute-0 nova_compute[187243]: 2025-12-03 00:36:21.659 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:23 compute-0 nova_compute[187243]: 2025-12-03 00:36:23.264 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:26 compute-0 podman[228455]: 2025-12-03 00:36:26.094401896 +0000 UTC m=+0.050279787 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 03 00:36:26 compute-0 podman[228454]: 2025-12-03 00:36:26.105367397 +0000 UTC m=+0.063294409 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 03 00:36:26 compute-0 nova_compute[187243]: 2025-12-03 00:36:26.660 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:28 compute-0 nova_compute[187243]: 2025-12-03 00:36:28.268 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:29 compute-0 podman[197600]: time="2025-12-03T00:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:36:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:36:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: ERROR   00:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: ERROR   00:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: ERROR   00:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: ERROR   00:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: ERROR   00:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:36:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:36:31 compute-0 nova_compute[187243]: 2025-12-03 00:36:31.662 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:33 compute-0 nova_compute[187243]: 2025-12-03 00:36:33.271 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:34 compute-0 podman[228497]: 2025-12-03 00:36:34.126506552 +0000 UTC m=+0.084701419 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:36:36 compute-0 nova_compute[187243]: 2025-12-03 00:36:36.665 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:36 compute-0 sshd-session[228495]: Invalid user syncuser from 45.78.219.95 port 48430
Dec 03 00:36:37 compute-0 sshd-session[228495]: Received disconnect from 45.78.219.95 port 48430:11: Bye Bye [preauth]
Dec 03 00:36:37 compute-0 sshd-session[228495]: Disconnected from invalid user syncuser 45.78.219.95 port 48430 [preauth]
Dec 03 00:36:38 compute-0 podman[228523]: 2025-12-03 00:36:38.093309945 +0000 UTC m=+0.049932658 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 03 00:36:38 compute-0 podman[228524]: 2025-12-03 00:36:38.118667343 +0000 UTC m=+0.072307853 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 03 00:36:38 compute-0 nova_compute[187243]: 2025-12-03 00:36:38.272 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:41 compute-0 nova_compute[187243]: 2025-12-03 00:36:41.702 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:43 compute-0 nova_compute[187243]: 2025-12-03 00:36:43.275 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:46 compute-0 nova_compute[187243]: 2025-12-03 00:36:46.703 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:48 compute-0 nova_compute[187243]: 2025-12-03 00:36:48.278 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:51 compute-0 nova_compute[187243]: 2025-12-03 00:36:51.738 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:53 compute-0 nova_compute[187243]: 2025-12-03 00:36:53.281 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:56 compute-0 nova_compute[187243]: 2025-12-03 00:36:56.740 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:56 compute-0 podman[228564]: 2025-12-03 00:36:56.846378001 +0000 UTC m=+0.069326550 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:36:56 compute-0 podman[228565]: 2025-12-03 00:36:56.846379142 +0000 UTC m=+0.064273294 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:36:58 compute-0 nova_compute[187243]: 2025-12-03 00:36:58.283 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:36:59 compute-0 nova_compute[187243]: 2025-12-03 00:36:59.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:36:59 compute-0 podman[197600]: time="2025-12-03T00:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:36:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:36:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec 03 00:37:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:37:00.748 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:37:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:37:00.749 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:37:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:37:00.749 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: ERROR   00:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: ERROR   00:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: ERROR   00:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: ERROR   00:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: ERROR   00:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:37:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:37:01 compute-0 nova_compute[187243]: 2025-12-03 00:37:01.769 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:03 compute-0 nova_compute[187243]: 2025-12-03 00:37:03.287 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:05 compute-0 podman[228608]: 2025-12-03 00:37:05.101265977 +0000 UTC m=+0.054373728 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:37:05 compute-0 nova_compute[187243]: 2025-12-03 00:37:05.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:06 compute-0 nova_compute[187243]: 2025-12-03 00:37:06.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:06 compute-0 nova_compute[187243]: 2025-12-03 00:37:06.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:06 compute-0 nova_compute[187243]: 2025-12-03 00:37:06.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:37:06 compute-0 nova_compute[187243]: 2025-12-03 00:37:06.771 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:07 compute-0 sshd-session[228606]: Invalid user deploy from 101.47.140.127 port 45760
Dec 03 00:37:07 compute-0 sshd-session[228606]: Received disconnect from 101.47.140.127 port 45760:11: Bye Bye [preauth]
Dec 03 00:37:07 compute-0 sshd-session[228606]: Disconnected from invalid user deploy 101.47.140.127 port 45760 [preauth]
Dec 03 00:37:08 compute-0 nova_compute[187243]: 2025-12-03 00:37:08.289 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:09 compute-0 podman[228631]: 2025-12-03 00:37:09.119419093 +0000 UTC m=+0.078670940 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:37:09 compute-0 podman[228632]: 2025-12-03 00:37:09.149479658 +0000 UTC m=+0.105229588 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 00:37:09 compute-0 nova_compute[187243]: 2025-12-03 00:37:09.589 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:11 compute-0 nova_compute[187243]: 2025-12-03 00:37:11.774 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:12 compute-0 nova_compute[187243]: 2025-12-03 00:37:12.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.105 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.229 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.230 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.246 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.247 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5811MB free_disk=73.16093444824219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.247 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.248 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:37:13 compute-0 nova_compute[187243]: 2025-12-03 00:37:13.292 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:14 compute-0 nova_compute[187243]: 2025-12-03 00:37:14.429 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:37:14 compute-0 nova_compute[187243]: 2025-12-03 00:37:14.430 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:37:13 up  1:45,  0 user,  load average: 0.00, 0.10, 0.18\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:37:14 compute-0 nova_compute[187243]: 2025-12-03 00:37:14.500 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:37:15 compute-0 nova_compute[187243]: 2025-12-03 00:37:15.009 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:37:15 compute-0 nova_compute[187243]: 2025-12-03 00:37:15.550 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:37:15 compute-0 nova_compute[187243]: 2025-12-03 00:37:15.551 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.304s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:37:16 compute-0 nova_compute[187243]: 2025-12-03 00:37:16.552 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:16 compute-0 nova_compute[187243]: 2025-12-03 00:37:16.552 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:16 compute-0 nova_compute[187243]: 2025-12-03 00:37:16.775 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:18 compute-0 nova_compute[187243]: 2025-12-03 00:37:18.296 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:21 compute-0 nova_compute[187243]: 2025-12-03 00:37:21.776 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:22 compute-0 nova_compute[187243]: 2025-12-03 00:37:22.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:23 compute-0 nova_compute[187243]: 2025-12-03 00:37:23.299 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:24 compute-0 sshd-session[228678]: Invalid user dd from 45.78.222.160 port 42812
Dec 03 00:37:25 compute-0 sshd-session[228678]: Received disconnect from 45.78.222.160 port 42812:11: Bye Bye [preauth]
Dec 03 00:37:25 compute-0 sshd-session[228678]: Disconnected from invalid user dd 45.78.222.160 port 42812 [preauth]
Dec 03 00:37:26 compute-0 nova_compute[187243]: 2025-12-03 00:37:26.778 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:27 compute-0 podman[228680]: 2025-12-03 00:37:27.097381207 +0000 UTC m=+0.054904671 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:37:27 compute-0 podman[228681]: 2025-12-03 00:37:27.106239657 +0000 UTC m=+0.059951567 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm)
Dec 03 00:37:28 compute-0 nova_compute[187243]: 2025-12-03 00:37:28.301 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:29 compute-0 podman[197600]: time="2025-12-03T00:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:37:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:37:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2607 "" "Go-http-client/1.1"
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: ERROR   00:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: ERROR   00:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: ERROR   00:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: ERROR   00:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: ERROR   00:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:37:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:37:31 compute-0 nova_compute[187243]: 2025-12-03 00:37:31.780 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:33 compute-0 nova_compute[187243]: 2025-12-03 00:37:33.354 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:36 compute-0 podman[228721]: 2025-12-03 00:37:36.094241104 +0000 UTC m=+0.054158103 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:37:36 compute-0 nova_compute[187243]: 2025-12-03 00:37:36.782 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:38 compute-0 nova_compute[187243]: 2025-12-03 00:37:38.357 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:40 compute-0 podman[228745]: 2025-12-03 00:37:40.089439571 +0000 UTC m=+0.049078347 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:37:40 compute-0 podman[228746]: 2025-12-03 00:37:40.125397702 +0000 UTC m=+0.077524372 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Dec 03 00:37:41 compute-0 nova_compute[187243]: 2025-12-03 00:37:41.825 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:43 compute-0 nova_compute[187243]: 2025-12-03 00:37:43.360 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:46 compute-0 nova_compute[187243]: 2025-12-03 00:37:46.826 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:48 compute-0 nova_compute[187243]: 2025-12-03 00:37:48.363 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:51 compute-0 nova_compute[187243]: 2025-12-03 00:37:51.827 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:53 compute-0 nova_compute[187243]: 2025-12-03 00:37:53.366 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:56 compute-0 nova_compute[187243]: 2025-12-03 00:37:56.832 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:58 compute-0 podman[228792]: 2025-12-03 00:37:58.108711468 +0000 UTC m=+0.063512864 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 03 00:37:58 compute-0 podman[228793]: 2025-12-03 00:37:58.112781149 +0000 UTC m=+0.063053163 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 03 00:37:58 compute-0 nova_compute[187243]: 2025-12-03 00:37:58.370 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:37:59 compute-0 nova_compute[187243]: 2025-12-03 00:37:59.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:37:59 compute-0 podman[197600]: time="2025-12-03T00:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:37:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:37:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec 03 00:38:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:38:00.750 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:38:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:38:00.750 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:38:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:38:00.750 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: ERROR   00:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: ERROR   00:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: ERROR   00:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: ERROR   00:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: ERROR   00:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:38:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:38:01 compute-0 nova_compute[187243]: 2025-12-03 00:38:01.834 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:03 compute-0 nova_compute[187243]: 2025-12-03 00:38:03.376 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:05 compute-0 nova_compute[187243]: 2025-12-03 00:38:05.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:06 compute-0 nova_compute[187243]: 2025-12-03 00:38:06.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:06 compute-0 nova_compute[187243]: 2025-12-03 00:38:06.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:38:06 compute-0 nova_compute[187243]: 2025-12-03 00:38:06.835 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:07 compute-0 podman[228834]: 2025-12-03 00:38:07.115529473 +0000 UTC m=+0.077679176 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:38:08 compute-0 nova_compute[187243]: 2025-12-03 00:38:08.405 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:08 compute-0 nova_compute[187243]: 2025-12-03 00:38:08.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:09 compute-0 nova_compute[187243]: 2025-12-03 00:38:09.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:09 compute-0 nova_compute[187243]: 2025-12-03 00:38:09.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 03 00:38:11 compute-0 podman[228859]: 2025-12-03 00:38:11.095130923 +0000 UTC m=+0.056536090 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true)
Dec 03 00:38:11 compute-0 podman[228860]: 2025-12-03 00:38:11.148190778 +0000 UTC m=+0.105046572 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 03 00:38:11 compute-0 nova_compute[187243]: 2025-12-03 00:38:11.861 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:12 compute-0 nova_compute[187243]: 2025-12-03 00:38:12.105 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:13 compute-0 nova_compute[187243]: 2025-12-03 00:38:13.408 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:13 compute-0 nova_compute[187243]: 2025-12-03 00:38:13.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.105 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.105 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.223 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.224 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.238 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.239 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5802MB free_disk=73.16087341308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.239 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:38:14 compute-0 nova_compute[187243]: 2025-12-03 00:38:14.239 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.287 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.288 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:38:14 up  1:46,  0 user,  load average: 0.04, 0.10, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.317 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing inventories for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.342 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating ProviderTree inventory for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.342 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Updating inventory in ProviderTree for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.356 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing aggregate associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.387 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Refreshing trait associations for resource provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_SOUND_MODEL_AC97,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOUND_MODEL_ICH9,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.418 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:38:15 compute-0 nova_compute[187243]: 2025-12-03 00:38:15.937 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:38:16 compute-0 nova_compute[187243]: 2025-12-03 00:38:16.446 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:38:16 compute-0 nova_compute[187243]: 2025-12-03 00:38:16.447 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.208s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:38:16 compute-0 nova_compute[187243]: 2025-12-03 00:38:16.863 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:17 compute-0 nova_compute[187243]: 2025-12-03 00:38:17.448 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:17 compute-0 nova_compute[187243]: 2025-12-03 00:38:17.448 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:18 compute-0 nova_compute[187243]: 2025-12-03 00:38:18.411 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:18 compute-0 nova_compute[187243]: 2025-12-03 00:38:18.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:18 compute-0 nova_compute[187243]: 2025-12-03 00:38:18.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 03 00:38:19 compute-0 nova_compute[187243]: 2025-12-03 00:38:19.097 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 03 00:38:21 compute-0 nova_compute[187243]: 2025-12-03 00:38:21.864 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:23 compute-0 nova_compute[187243]: 2025-12-03 00:38:23.413 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:26 compute-0 nova_compute[187243]: 2025-12-03 00:38:26.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:38:26 compute-0 nova_compute[187243]: 2025-12-03 00:38:26.867 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:28 compute-0 nova_compute[187243]: 2025-12-03 00:38:28.417 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:29 compute-0 podman[228905]: 2025-12-03 00:38:29.091491954 +0000 UTC m=+0.049957829 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 03 00:38:29 compute-0 podman[228906]: 2025-12-03 00:38:29.105506411 +0000 UTC m=+0.057063534 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, version=9.6, release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 03 00:38:29 compute-0 podman[197600]: time="2025-12-03T00:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:38:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:38:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: ERROR   00:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: ERROR   00:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: ERROR   00:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: ERROR   00:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: ERROR   00:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:38:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:38:31 compute-0 nova_compute[187243]: 2025-12-03 00:38:31.904 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:33 compute-0 nova_compute[187243]: 2025-12-03 00:38:33.420 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:36 compute-0 nova_compute[187243]: 2025-12-03 00:38:36.906 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:38 compute-0 podman[228948]: 2025-12-03 00:38:38.097648163 +0000 UTC m=+0.047615442 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 03 00:38:38 compute-0 nova_compute[187243]: 2025-12-03 00:38:38.422 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:41 compute-0 nova_compute[187243]: 2025-12-03 00:38:41.950 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:42 compute-0 podman[228971]: 2025-12-03 00:38:42.126918583 +0000 UTC m=+0.082324991 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=watcher_latest, tcib_managed=true)
Dec 03 00:38:42 compute-0 podman[228972]: 2025-12-03 00:38:42.14373663 +0000 UTC m=+0.096639986 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 03 00:38:43 compute-0 nova_compute[187243]: 2025-12-03 00:38:43.424 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:46 compute-0 nova_compute[187243]: 2025-12-03 00:38:46.952 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:48 compute-0 nova_compute[187243]: 2025-12-03 00:38:48.427 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:52 compute-0 nova_compute[187243]: 2025-12-03 00:38:52.002 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:53 compute-0 nova_compute[187243]: 2025-12-03 00:38:53.429 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:57 compute-0 nova_compute[187243]: 2025-12-03 00:38:57.005 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:58 compute-0 nova_compute[187243]: 2025-12-03 00:38:58.431 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:38:59 compute-0 podman[197600]: time="2025-12-03T00:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:38:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:38:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:39:00 compute-0 podman[229015]: 2025-12-03 00:39:00.097212518 +0000 UTC m=+0.052514562 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 03 00:39:00 compute-0 podman[229016]: 2025-12-03 00:39:00.121964601 +0000 UTC m=+0.075123192 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, vcs-type=git)
Dec 03 00:39:00 compute-0 nova_compute[187243]: 2025-12-03 00:39:00.128 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:39:00.751 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:39:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:39:00.751 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:39:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:39:00.751 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: ERROR   00:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: ERROR   00:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: ERROR   00:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: ERROR   00:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: ERROR   00:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:39:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:39:02 compute-0 nova_compute[187243]: 2025-12-03 00:39:02.007 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:03 compute-0 nova_compute[187243]: 2025-12-03 00:39:03.434 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:07 compute-0 nova_compute[187243]: 2025-12-03 00:39:07.009 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:07 compute-0 nova_compute[187243]: 2025-12-03 00:39:07.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:08 compute-0 nova_compute[187243]: 2025-12-03 00:39:08.436 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:08 compute-0 nova_compute[187243]: 2025-12-03 00:39:08.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:08 compute-0 nova_compute[187243]: 2025-12-03 00:39:08.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:08 compute-0 nova_compute[187243]: 2025-12-03 00:39:08.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:39:09 compute-0 podman[229055]: 2025-12-03 00:39:09.148286209 +0000 UTC m=+0.086619567 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 03 00:39:12 compute-0 nova_compute[187243]: 2025-12-03 00:39:12.010 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:13 compute-0 podman[229081]: 2025-12-03 00:39:13.118127738 +0000 UTC m=+0.070961159 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 00:39:13 compute-0 podman[229082]: 2025-12-03 00:39:13.245958334 +0000 UTC m=+0.192100410 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:39:13 compute-0 nova_compute[187243]: 2025-12-03 00:39:13.437 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:13 compute-0 nova_compute[187243]: 2025-12-03 00:39:13.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:15 compute-0 nova_compute[187243]: 2025-12-03 00:39:15.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.104 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.104 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.253 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.255 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.271 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.272 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5807MB free_disk=73.15696716308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.272 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:39:16 compute-0 nova_compute[187243]: 2025-12-03 00:39:16.273 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:39:17 compute-0 nova_compute[187243]: 2025-12-03 00:39:17.012 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:17 compute-0 nova_compute[187243]: 2025-12-03 00:39:17.374 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:39:17 compute-0 nova_compute[187243]: 2025-12-03 00:39:17.375 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:39:16 up  1:47,  0 user,  load average: 0.01, 0.07, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:39:17 compute-0 nova_compute[187243]: 2025-12-03 00:39:17.393 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:39:17 compute-0 nova_compute[187243]: 2025-12-03 00:39:17.899 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:39:18 compute-0 nova_compute[187243]: 2025-12-03 00:39:18.410 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:39:18 compute-0 nova_compute[187243]: 2025-12-03 00:39:18.410 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:39:18 compute-0 nova_compute[187243]: 2025-12-03 00:39:18.440 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:19 compute-0 nova_compute[187243]: 2025-12-03 00:39:19.410 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:19 compute-0 nova_compute[187243]: 2025-12-03 00:39:19.411 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:22 compute-0 nova_compute[187243]: 2025-12-03 00:39:22.014 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:22 compute-0 nova_compute[187243]: 2025-12-03 00:39:22.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:39:23 compute-0 nova_compute[187243]: 2025-12-03 00:39:23.443 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:27 compute-0 nova_compute[187243]: 2025-12-03 00:39:27.015 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:28 compute-0 nova_compute[187243]: 2025-12-03 00:39:28.446 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:29 compute-0 podman[197600]: time="2025-12-03T00:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:39:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:39:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec 03 00:39:30 compute-0 sshd-session[229127]: Received disconnect from 45.78.219.95 port 34460:11: Bye Bye [preauth]
Dec 03 00:39:30 compute-0 sshd-session[229127]: Disconnected from 45.78.219.95 port 34460 [preauth]
Dec 03 00:39:31 compute-0 podman[229130]: 2025-12-03 00:39:31.090946805 +0000 UTC m=+0.050338418 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 03 00:39:31 compute-0 podman[229131]: 2025-12-03 00:39:31.091495129 +0000 UTC m=+0.046893963 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: ERROR   00:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: ERROR   00:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: ERROR   00:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: ERROR   00:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: ERROR   00:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:39:31 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:39:32 compute-0 nova_compute[187243]: 2025-12-03 00:39:32.016 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:33 compute-0 nova_compute[187243]: 2025-12-03 00:39:33.449 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:37 compute-0 nova_compute[187243]: 2025-12-03 00:39:37.017 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:38 compute-0 nova_compute[187243]: 2025-12-03 00:39:38.453 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:40 compute-0 podman[229167]: 2025-12-03 00:39:40.106602349 +0000 UTC m=+0.064295715 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 03 00:39:42 compute-0 nova_compute[187243]: 2025-12-03 00:39:42.021 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:43 compute-0 nova_compute[187243]: 2025-12-03 00:39:43.455 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:44 compute-0 podman[229192]: 2025-12-03 00:39:44.09475725 +0000 UTC m=+0.056831879 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 03 00:39:44 compute-0 podman[229193]: 2025-12-03 00:39:44.133572011 +0000 UTC m=+0.084970275 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 03 00:39:47 compute-0 nova_compute[187243]: 2025-12-03 00:39:47.022 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:48 compute-0 sshd-session[229190]: Invalid user webuser from 45.78.222.160 port 38880
Dec 03 00:39:48 compute-0 nova_compute[187243]: 2025-12-03 00:39:48.456 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:48 compute-0 sshd-session[229190]: Received disconnect from 45.78.222.160 port 38880:11: Bye Bye [preauth]
Dec 03 00:39:48 compute-0 sshd-session[229190]: Disconnected from invalid user webuser 45.78.222.160 port 38880 [preauth]
Dec 03 00:39:52 compute-0 nova_compute[187243]: 2025-12-03 00:39:52.023 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:53 compute-0 nova_compute[187243]: 2025-12-03 00:39:53.459 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:57 compute-0 nova_compute[187243]: 2025-12-03 00:39:57.025 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:58 compute-0 nova_compute[187243]: 2025-12-03 00:39:58.461 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:39:59 compute-0 podman[197600]: time="2025-12-03T00:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:39:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:39:59 compute-0 podman[197600]: @ - - [03/Dec/2025:00:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2608 "" "Go-http-client/1.1"
Dec 03 00:40:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:40:00.752 104379 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:40:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:40:00.752 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:40:00 compute-0 ovn_metadata_agent[104374]: 2025-12-03 00:40:00.753 104379 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: ERROR   00:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: ERROR   00:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: ERROR   00:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: ERROR   00:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: ERROR   00:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:40:01 compute-0 openstack_network_exporter[199746]: 
Dec 03 00:40:01 compute-0 nova_compute[187243]: 2025-12-03 00:40:01.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:02 compute-0 nova_compute[187243]: 2025-12-03 00:40:02.027 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:02 compute-0 podman[229240]: 2025-12-03 00:40:02.103510137 +0000 UTC m=+0.051510917 container health_status b3e97224212d0bb2fb1c1f04e7f02cc9fef1a4de279f54f438c14f6d466bbb84 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container)
Dec 03 00:40:02 compute-0 podman[229239]: 2025-12-03 00:40:02.115296209 +0000 UTC m=+0.063888874 container health_status a70531a46c5ad82927881332c62620768e0ac1f2c4a2f00f31dd325c8e1c965a (image=38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 00:40:03 compute-0 nova_compute[187243]: 2025-12-03 00:40:03.463 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:07 compute-0 nova_compute[187243]: 2025-12-03 00:40:07.029 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:08 compute-0 nova_compute[187243]: 2025-12-03 00:40:08.467 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:09 compute-0 nova_compute[187243]: 2025-12-03 00:40:09.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:09 compute-0 nova_compute[187243]: 2025-12-03 00:40:09.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:09 compute-0 nova_compute[187243]: 2025-12-03 00:40:09.592 187247 DEBUG nova.compute.manager [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 03 00:40:10 compute-0 nova_compute[187243]: 2025-12-03 00:40:10.592 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:11 compute-0 podman[229280]: 2025-12-03 00:40:11.107774628 +0000 UTC m=+0.061850394 container health_status 28ef044ece0b17b4832958190c1543a641cf6b21a67fbfa6a53cd18ad7ec793d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 03 00:40:12 compute-0 nova_compute[187243]: 2025-12-03 00:40:12.032 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:13 compute-0 nova_compute[187243]: 2025-12-03 00:40:13.521 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:13 compute-0 nova_compute[187243]: 2025-12-03 00:40:13.588 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:15 compute-0 podman[229304]: 2025-12-03 00:40:15.103617301 +0000 UTC m=+0.061769882 container health_status 282390f019f490eecc433f388a2ec50f2fbd2631e85b029febec037f36f19b41 (image=38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 03 00:40:15 compute-0 podman[229305]: 2025-12-03 00:40:15.163578756 +0000 UTC m=+0.118628720 container health_status e5e6c47d93c2a1fea72714e152f931749ef43f0bf1a7793b2d45a96d8fb6daa6 (image=38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.2:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 03 00:40:15 compute-0 sshd-session[229349]: Connection closed by authenticating user root 143.198.96.196 port 47206 [preauth]
Dec 03 00:40:17 compute-0 nova_compute[187243]: 2025-12-03 00:40:17.032 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:17 compute-0 nova_compute[187243]: 2025-12-03 00:40:17.591 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.103 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.104 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.220 187247 WARNING nova.virt.libvirt.driver [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.221 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.237 187247 DEBUG oslo_concurrency.processutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.237 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5813MB free_disk=73.15696716308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.238 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.238 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 03 00:40:18 compute-0 nova_compute[187243]: 2025-12-03 00:40:18.525 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:18 compute-0 sshd-session[229352]: Accepted publickey for zuul from 192.168.122.10 port 47198 ssh2: ECDSA SHA256:gFecZz+/piDVvdr3RfBVEzuXn/qY+k/W7mJ58IKimrM
Dec 03 00:40:18 compute-0 systemd-logind[795]: New session 31 of user zuul.
Dec 03 00:40:18 compute-0 systemd[1]: Started Session 31 of User zuul.
Dec 03 00:40:18 compute-0 sshd-session[229352]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 00:40:19 compute-0 sudo[229356]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 03 00:40:19 compute-0 sudo[229356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 00:40:19 compute-0 nova_compute[187243]: 2025-12-03 00:40:19.876 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 03 00:40:19 compute-0 nova_compute[187243]: 2025-12-03 00:40:19.877 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 00:40:18 up  1:48,  0 user,  load average: 0.00, 0.06, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 03 00:40:19 compute-0 nova_compute[187243]: 2025-12-03 00:40:19.951 187247 DEBUG nova.compute.provider_tree [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed in ProviderTree for provider: 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 03 00:40:20 compute-0 nova_compute[187243]: 2025-12-03 00:40:20.459 187247 DEBUG nova.scheduler.client.report [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Inventory has not changed for provider 0d6e1fe8-f800-4b94-a0c0-ea75083d5248 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 03 00:40:20 compute-0 nova_compute[187243]: 2025-12-03 00:40:20.976 187247 DEBUG nova.compute.resource_tracker [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 03 00:40:20 compute-0 nova_compute[187243]: 2025-12-03 00:40:20.977 187247 DEBUG oslo_concurrency.lockutils [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.739s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 03 00:40:21 compute-0 nova_compute[187243]: 2025-12-03 00:40:21.977 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:21 compute-0 nova_compute[187243]: 2025-12-03 00:40:21.979 187247 DEBUG oslo_service.periodic_task [None req-85467988-0a62-4bed-a17c-445bb2342e9d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 03 00:40:22 compute-0 nova_compute[187243]: 2025-12-03 00:40:22.034 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:23 compute-0 ovs-vsctl[229527]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 03 00:40:23 compute-0 nova_compute[187243]: 2025-12-03 00:40:23.527 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:24 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 03 00:40:24 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 03 00:40:24 compute-0 virtqemud[186944]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 00:40:25 compute-0 crontab[229940]: (root) LIST (root)
Dec 03 00:40:27 compute-0 nova_compute[187243]: 2025-12-03 00:40:27.035 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:27 compute-0 systemd[1]: Starting Hostname Service...
Dec 03 00:40:27 compute-0 systemd[1]: Started Hostname Service.
Dec 03 00:40:28 compute-0 nova_compute[187243]: 2025-12-03 00:40:28.529 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 03 00:40:29 compute-0 podman[197600]: time="2025-12-03T00:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 03 00:40:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17066 "" "Go-http-client/1.1"
Dec 03 00:40:29 compute-0 podman[197600]: @ - - [03/Dec/2025:00:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec 03 00:40:31 compute-0 openstack_network_exporter[199746]: ERROR   00:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 03 00:40:31 compute-0 openstack_network_exporter[199746]: ERROR   00:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:40:31 compute-0 openstack_network_exporter[199746]: ERROR   00:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 03 00:40:31 compute-0 openstack_network_exporter[199746]: ERROR   00:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 03 00:40:31 compute-0 openstack_network_exporter[199746]: ERROR   00:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 03 00:40:32 compute-0 nova_compute[187243]: 2025-12-03 00:40:32.037 187247 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
